Script to rename '../foo' to asset_path('foo') - ruby-on-rails

This is not limited to Rails developers, but I would assume it's pretty common to have to do this since Rails 3.1.
I'm looking for a script/some method of replacing everything of the form
'../foo/BLAHBLAH'
with <%= asset_path 'BLAHBLAH' %>
where foo is the name of the asset type, so it can be either images or fonts.
Anyone have experience with this?

You can do this with a global search and replace.
In TextMate you can hit Command-Shift-F to enter a project-wide search. Then search for \.\.\/images\/(.*?)([\)'"]) and replace it with <%= asset_path('$1') %>$2 (the second group puts the closing quote or paren back).
With find and sed it's a simple one-liner (GNU sed shown; BSD/macOS sed needs -i '' instead of -i):
find PROJECT_DIR -type f -name "*.html" -exec sed -i -e 's/\.\.\/images\/\([^)'\''"]*\)/<%= asset_path("\1") %>/g' {} \;
And in Vim you can do:
:args ./**
:argdo %s/\.\.\/images\/\(.\{-}\)\([)'"]\)/<%= asset_path('\1') %>\2/g
(Vim uses \(...\) for capture groups and .\{-} for a non-greedy match; the second group restores the closing quote or paren.)
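The sed approach can be generalized to both asset types in a loop. A minimal sketch, assuming GNU sed; the demo file name and its contents are invented for illustration:

```shell
# Create a throwaway file containing both asset styles
workdir=$(mktemp -d)
cat > "$workdir/demo.html.erb" <<'EOF'
<img src="../images/logo.png">
<link href="../fonts/icons.woff">
EOF

# Rewrite ../images/FILE and ../fonts/FILE to <%= asset_path('FILE') %>,
# stopping the capture before a closing quote or paren
for type in images fonts; do
  sed -i -e "s|\.\./$type/\([^)'\"]*\)|<%= asset_path('\1') %>|g" "$workdir"/*.html.erb
done

cat "$workdir/demo.html.erb"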

Related

For certain files in a directory carry out an action

I haven't worked with this stuff in years, so please be patient!
I'm having some really weird issues with Mac Excel greying out some .csv files but not others. From what I've read so far, this could have something to do with some of the more hidden file parameters.
Anyways, I'd like to find the files with a certain name in the directory, do a getfileinfo on them and spit out the result, i.e. something like:
for each i in (ls \*_xyz*.csv) do getfileinfo $i | echo
(or whatever more intelligent way this can be accomplished these days...)
I tried a few combinations but keep getting "-bash syntax error", so I've decided it's time to get help...
Thanks!!
Create dummy test files:
$ touch file{1..10}_xyz.csv
$ ls
file10_xyz.csv file1_xyz.csv file2_xyz.csv file3_xyz.csv file4_xyz.csv file5_xyz.csv file6_xyz.csv file7_xyz.csv file8_xyz.csv file9_xyz.csv
There are many ways to do this. My favorite is Method 1.
Method 1)
$ find . -name "*xyz*.csv" -exec someCommand {} \;
Method 2)
$ for x in $(find . -name "*xyz*.csv") ; do someCommand "$x" ; done
Method 3)
$ find . -name "*xyz*.csv" | xargs someCommand
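One caveat: Methods 2 and 3 split the filename list on whitespace, so a name like "report 1_xyz.csv" would be mangled. A null-delimited variant of Method 3 avoids that (the filenames below are invented for the demo):

```shell
# Two dummy files, one with a space in its name
dir=$(mktemp -d)
touch "$dir/report 1_xyz.csv" "$dir/plain_xyz.csv"

# -print0 emits NUL-separated names; xargs -0 reads them back intact
found=$(find "$dir" -name "*xyz*.csv" -print0 | xargs -0 -n1 basename | sort)
echo "$found"
```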

Combine grep -v with grep -r?

I want to remove an entire line of text from all files in a given directory. I know I can use grep -v foo filename to do this one file at a time. And I know I can use grep -r foo to search recursively through a directory. How do I combine these commands to remove a given line of text from all files in a directory?
The UNIX command to find files is named find, not grep. Forget you ever heard of grep -r for this job; here's the right way to find files and perform some action on them:
find . -type f -print0 | xargs -0 sed -i '/badline/d'
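A self-contained run of that pipeline (GNU sed assumed; on BSD/macOS use sed -i '' instead; the filenames and the "badline" marker are invented):

```shell
# Build a tiny tree with lines to delete
dir=$(mktemp -d)
printf 'keep\nbadline here\nkeep too\n' > "$dir/a.txt"
printf 'only badline\n' > "$dir/b.txt"

# -print0/-0 keeps odd filenames safe; sed deletes matching lines in place
find "$dir" -type f -print0 | xargs -0 sed -i '/badline/d'

cat "$dir/a.txt"
```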
Try something like:
grep -lr 'foo' . | xargs sed -i '/foo/d'
Broken down:
grep:
-l 'List matching filenames only'
-r 'Search recursively'
xargs: For each file found, run the following command
sed -i: edit in place; the /foo/d script deletes every line matching foo
I think this would work:
grep -ilre 'Foo' . | xargs sed -i 'extension' '/Foo/d'
Where 'extension' refers to the addition to the file name. It will make a copy of the original file with the extension you designated and the modified file will have the original filename. I added -i in case you require it to be case insensitive.
modified file1 becomes "file1"
original file1 becomes "file1extension"
invalid command code ., despite escaping periods, using sed
One of the responses suggests that sed's -i option on OS X is slightly different, so you need to supply a (possibly empty) backup extension. Without one, your sed script is consumed as the extension and the filename is interpreted as the sed command, which is why you are seeing that error.
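The backup-extension behaviour is easy to see in isolation. A sketch with GNU sed, which accepts the suffix glued to -i (BSD/macOS sed wants it as a separate argument, e.g. sed -i '.bak' ...); the file and its contents are invented:

```shell
dir=$(mktemp -d)
printf 'Foo line\nbar line\n' > "$dir/file1"

# Delete lines containing Foo, keeping the original as file1.bak
sed -i.bak '/Foo/d' "$dir/file1"

ls "$dir"
cat "$dir/file1"
```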

How to use grep to search only in specific file types?

I have a lot of files and I want to find where is MYVAR.
I'm sure it's in one of .yml files but I can't find in the grep manual how to specify the filetype.
grep -rn --include="*.yml" "MYVAR" your_directory
Please note that grep is case-sensitive by default (pass -i to ignore case) and accepts regular expressions as well as plain strings. Quoting the --include pattern stops the shell from expanding it before grep sees it.
You don't give grep a filetype, just a list of files. Your shell can expand a pattern to give grep the correct list of files, though:
$ grep MYVAR *.yml
If your .yml files aren't all in one directory, it may be easier to up the ante and use find:
$ find . -name '*.yml' -exec grep MYVAR {} +
This will find, from the current directory and recursively deeper, any files ending with .yml. It then substitutes that list of files for the pair of braces {}. The trailing + terminates the -exec clause and tells find to pass as many filenames as possible to each grep invocation, instead of running grep once per file.
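The difference between terminating -exec with \; and with + can be observed by exec-ing echo (the temporary files are invented for the demo):

```shell
dir=$(mktemp -d)
touch "$dir/a.yml" "$dir/b.yml"

# \; runs the command once per file; + batches files into one invocation
per_file=$(find "$dir" -name '*.yml' -exec echo run {} \; | wc -l)
batched=$(find "$dir" -name '*.yml' -exec echo run {} + | wc -l)
echo "$per_file $batched"
```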
If all your .yml files are in one directory, then cd to that directory, and then ...
grep MYVAR *.yml
If all your .yml files are in multiple directories, then cd to the top of those directories, and then ...
grep MYVAR `find . -name \*.yml`
If you don't know the top of those directories where your .yml files are located and want to search the whole system ...
grep MYVAR `find / -name \*.yml`
The last option may require root privileges to read through all directories.
The ` character above is the one that is located along with the ~ key on the keyboard.
find . -name \*.yml -exec grep -Hn MYVAR {} \;
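A quick self-check of the --include form above (the filenames and contents are invented):

```shell
dir=$(mktemp -d)
echo 'MYVAR: 1' > "$dir/settings.yml"
echo 'MYVAR: 1' > "$dir/notes.txt"

# Only the .yml file should be reported
found=$(grep -rl --include='*.yml' MYVAR "$dir")
echo "$found"
```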

How can I grep hidden files?

I am searching through a Git repository and would like to include the .git folder.
grep does not include this folder if I run
grep -r search *
What would be a grep command to include this folder?
Please refer to the solution at the end of this post as a better alternative to what you're doing.
You can explicitly include hidden files (a directory is also a file).
grep -r search * .[^.]*
The * will match all files except hidden ones, and .[^.]* will match only hidden files while excluding the .. parent-directory entry. However, this will fail if there are either no non-hidden files or no hidden files in a given directory. You could of course explicitly add .git instead of .*.
However, if you simply want to search in a given directory, do it like this:
grep -r search .
The . will match the current path, which will include both non-hidden and hidden files.
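The difference between expanding * and passing . can be demonstrated directly (the .git layout below is a minimal invented stand-in):

```shell
dir=$(mktemp -d)
mkdir "$dir/.git"
echo needle > "$dir/.git/config"
echo needle > "$dir/visible.txt"
cd "$dir"

# * skips the hidden directory; . includes it
with_star=$(grep -rl needle * | wc -l)
with_dot=$(grep -rl needle . | wc -l)
echo "$with_star $with_dot"
```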
I just ran into this problem, and based on @bitmask's answer, here is my simple modification to avoid the problem pointed out by @sehe:
grep -r search_string * .[^.]*
Perhaps you will prefer to combine "grep" with the "find" command for a complete solution like:
find . -exec grep -Hn search {} \;
This command will search inside hidden files and directories for the string "search" and list any file containing a match, in this output format:
File path:Line number:Line with match
./foo/bar:42:search line
./foo/.bar:42:search line
./.foo/bar:42:search line
./.foo/.bar:42:search line
To prevent matching . and .. which are not hidden files, you can use grep with ls -A like in this example:
ls -A | grep "^\."
^\. states that the first character must be .
The -A or --almost-all option excludes the results . and .. so that only hidden files and directories are matched.
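A tiny sanity check of the ls -A pattern (the filenames are invented):

```shell
dir=$(mktemp -d)
cd "$dir"
touch .hidden visible

# Keep only entries whose first character is a dot
hidden_only=$(ls -A | grep '^\.')
echo "$hidden_only"
```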
You may want to use this approach, assuming you're searching the current directory (otherwise replace . with the desired directory):
find . -type f | xargs grep search
or if you just want to search at the top level (which is quicker to test if you're trying these out):
find . -maxdepth 1 -type f | xargs grep search
(-maxdepth must come before other tests like -type, or GNU find will warn.)
UPDATE: I modified the examples in response to Scott's comments. I also added "-type f".
To search within ONLY all hidden files and directories from your current location:
find . -name ".*" -exec grep -rs search {} \;
ONLY all hidden files:
find . -name ".*" -type f -exec grep -s search {} \;
ONLY all hidden directories:
find . -name ".*" -type d -exec grep -rs search {} \;
All the other answers are better. This one might be easy to remember:
find . -type f | xargs grep search
It finds only files (including hidden) and greps each file.
To list only the hidden entries in the current folder you can use:
ls -al | grep " \."
It is a very simple command to list and pipe to grep; note that it also matches the . and .. entries, so prefer ls -A if you want those excluded.
In addition to Tyler's suggestion, here is a command to grep all files recursively, including hidden ones (-name "*.*" would miss files whose names contain no dot):
find . -type f -exec grep -li 'search' {} \;
You can also search only specific types of hidden files, e.g. hidden .directory files:
grep -r --include="*.directory" "search-string" .
This may work better than some of the other options, which can be too slow.

Searching HTML files in a directory for text

Ok, I'm very new to programming, but I'm learning how to conceptualize and talk about what I want, what I need to learn, and how to search better.
Right now I am working with a directory /Food and have .html pages that I've downloaded from several sites.
I'd like to create a script to basically use the directory /Food and all files in this folder and its sub-directories, and compare the text for files that contain the same strings I input.
So something like:
commandforsearchingtextfiles [option for directory]/food *.[or command for all files following this directory path]
salt (string1)
sugar (string 2)
flour (string 3)
echo results
The results/output should be the files that contain the strings... and any extra ideas on how to organize the output would be welcome.
Again, if this is covered, please just point me in the right locations of where to learn about this but if you have any quick advice or a quick script, that would be great too.
Are you on Linux? Or could you use Cygwin (if on Windows)?
... if so, the basic Linux commands would cope with this pretty well.
e.g. to search for all files containing "salt"...
find Food/ -type f -name "*.html" -print0 | xargs -0 grep salt
You can narrow or widen the search by adding switches to the various commands, e.g. case-insensitive:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i salt
or just the filenames (not the matched text):
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -l salt
For more, check man grep.
Multi-word phrases are also possible:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i "the quick brown fox"
But there is an added complication - HTML itself doesn't care about whitespace, so the phrase could be split over multiple lines. This means the whitespace in the documents could differ from your search, e.g. the above won't match
the quick
brown fox
but it's valid HTML. A whitespace character class handles runs of spaces, and since grep normally matches one line at a time, GNU grep's -z option (which treats the whole file as a single record) lets the match span the line break:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -lizE "the[[:space:]]+quick[[:space:]]+brown[[:space:]]+fox"
but it's starting to get messy.
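To check that the line-break case really is handled, here is a sketch using GNU grep's -z (an assumption: this flag is GNU-specific), which reads the whole file as one record so [[:space:]]+ can match across the newline; the demo file is invented:

```shell
dir=$(mktemp -d)
printf '<p>the quick\nbrown fox</p>\n' > "$dir/a.html"

# Plain -iE fails here because grep matches one line at a time;
# adding -z lets the pattern span the line break
if grep -qizE 'the[[:space:]]+quick[[:space:]]+brown[[:space:]]+fox' "$dir/a.html"; then
  matched=yes
else
  matched=no
fi
echo "$matched"
```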
You could put this in a .sh script so you don't have to type all of that, e.g.
#!/bin/sh
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i "$*"
which, when saved as a file and made executable, can be run to do a test search:
find_in_food salt
will display the matching lines, prefixed with their filenames.
(this is of course barely touching the surface of what's possible with this!)
