Create new folder named with a specified file name in Automator

I want to create a new folder named after the file used as the input.
Example:
If I use my new service with the file "test (test).jpg", I want a folder named "test (test)" to be created automatically.

Doing this with regular Automator actions gets a bit convoluted, since you need to save the original input, get the name, make the folder, get the original input back, etc. You can use a Run AppleScript action to do most of that in one go, although the details depend on what you want to do with the original input and the created folder paths.
The following Run AppleScript action creates new folders with the names of the input items (note that a service can pass multiple items) and just passes the original input on. The new folders are created in the same parent folder; no error handling (duplicate names, etc.) is performed:
on run {input, parameters} -- make new folders from base file names
    set output to {}
    repeat with anItem in the input -- step through each item in the input
        set anItem to anItem as text
        tell application "System Events" to tell disk item anItem
            set theContainer to path of container
            set {theName, theExtension} to {name, name extension}
        end tell
        if theExtension is in {missing value, ""} then
            set theExtension to ""
        else
            set theExtension to "." & theExtension
        end if
        set theName to text 1 thru -((count theExtension) + 1) of theName -- the name part
        tell application "Finder"
            make new folder at folder theContainer with properties {name:theName}
            set end of output to result as alias
        end tell
    end repeat
    return input -- or output
end run

Related

Copy folders into Azure data storage (Azure Data Factory)

I am trying to copy folders and their files from FTP into Azure data storage by looping through the folders and, for each folder, copying its content into a container that has the folder's name. For this I used a Get Metadata, a ForEach, and a Copy Data component. For now I am able to copy all the folders into the same container, but what I want is multiple containers in the output, each named after one of the folders and containing that folder's files from the FTP.
P.S.: I am still new to Azure Data Factory.
Any advice or help is very welcome :)
You need to add a Get Metadata activity before the ForEach. The Get Metadata activity will get the files in the current directory and pass them to the ForEach; you connect it to your Blob storage folder.
Try something like this:
Set up a JSON source:
Create a pipeline and use a GetMetadata activity to list all the folders in the container/storage. Select Child items as the field.
Feed the metadata output (the list of container contents) into a Filter activity and keep only the folders: @equals(item().type, 'Folder').
Input the list of folders to a ForEach activity.
Inside the ForEach, set the current item() to a variable and use it as a parameter for a parameterized source dataset, which is a clone of the original source.
This will list the files from each folder in your container.
Feed this to another Filter activity, this time filtering on files: @equals(item().type, 'File').
Now create another pipeline in which the copy activity runs for each file found to have the same name as its parent folder.
Create parameters in the new child pipeline to receive the current folder and file name of the iteration from the parent pipeline, to evaluate for the copy.
Inside the child pipeline, start with a ForEach whose input is the list of filenames inside the folder, received into the parameter: @pipeline().parameters.filesnamesreceived
Use a variable to hold the current item and an If Condition to check whether the file and folder names match.
Note: consider dropping the file extension as your requirement dictates, since the metadata holds the complete file name along with its extension.
If True -> the names match; copy from source to sink.
Here the hierarchy is preserved, and you can also use "Prefix" to specify the file path, since the copy preserves hierarchy. It uses the service-side filter for Blob storage, which performs better than a wildcard filter.
The sub-path after the last "/" in the prefix is preserved. For example, given the source container/folder/subfolder/file.txt and a prefix of folder/sub, the preserved file path is subfolder/file.txt, which fits your scenario.
This copies files like /source/source/source.json to /sink/source/source.json.
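If you later want to script the same folder-per-container layout outside Data Factory, here is a minimal Python sketch using the azure-storage-blob SDK. The connection string, the source container name, and the absence of error handling are all assumptions for illustration, not part of the original question:

# Sketch: create one container per top-level "folder" of a source container
# and copy that folder's blobs into it (assumed names, no error handling).
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection string>")  # assumption
source = service.get_container_client("source")  # assumed source container

# With a delimiter, virtual folders are returned as BlobPrefix entries
# whose names end in "/"; plain top-level blobs are skipped.
for item in source.walk_blobs(delimiter="/"):
    folder = item.name.rstrip("/")
    if folder == item.name:
        continue  # a plain top-level blob, not a folder
    # Container names only allow lowercase letters, digits, and hyphens.
    dest = service.create_container(folder.lower())
    for blob in source.list_blobs(name_starts_with=item.name):
        # Server-side copy; within one storage account the shared-key
        # credential from the connection string is sufficient.
        src_url = source.get_blob_client(blob.name).url
        dest.get_blob_client(blob.name[len(item.name):]).start_copy_from_url(src_url)

The walk_blobs call with a delimiter is what surfaces the virtual folders; everything else is plain server-side blob copying.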
AzCopy is a simpler solution for this than Data Factory, and a dry run can be used to check which files/folders will be copied. A single-blob copy with the Azure CLI, for example:
az storage blob copy start \
    --destination-container destContainer \
    --destination-blob myBlob \
    --source-account-name mySourceAccount \
    --source-account-key mySourceAccountKey \
    --source-container myContainer \
    --source-blob myBlob

Copy Files from one folder to another folder

I am reading filenames from an Excel spreadsheet and iterating over the files present in a specified folder. If a file name in the folder matches a filename in the Excel sheet, the message "Yes" is entered in the second column of the sheet; otherwise "No" is entered. But when the filenames are not equal, "No" is not shown in the spreadsheet; the cell is left blank. I am also trying to copy the files whose names matched to another folder, but I am getting an error.
First, it would be simpler and more efficient to use the If file exists command:
If file exists ($filepath$) Then
    Set Cell B$Counter$ = Yes
    Copy file command
Else
    Set Cell B$Counter$ = No
End If
The error happened because you are looping over the files in the folder, and the counter belongs to the file loop, not to the Excel rows.
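For comparison, here is the same logic as a minimal Python sketch using openpyxl and shutil; the folder paths, the workbook name, and the layout (filenames in column A, flag in column B) are assumptions:

# Sketch: mark each spreadsheet filename Yes/No depending on whether it
# exists in the source folder, and copy the matches to another folder.
import shutil
from pathlib import Path

from openpyxl import load_workbook

source = Path("C:/source_folder")   # assumed source folder
dest = Path("C:/matched_files")     # assumed destination folder
dest.mkdir(exist_ok=True)

wb = load_workbook("filenames.xlsx")  # assumed workbook
ws = wb.active

for row in range(1, ws.max_row + 1):
    name = ws.cell(row=row, column=1).value  # filename in column A
    if not name:
        continue  # skip blank rows
    candidate = source / str(name)
    if candidate.is_file():
        ws.cell(row=row, column=2, value="Yes")  # column B gets the flag
        shutil.copy2(candidate, dest / candidate.name)
    else:
        ws.cell(row=row, column=2, value="No")

wb.save("filenames.xlsx")

Because the Yes/No cell is written from the same row index the filename was read from, the flag can never fall out of step with the file loop.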

How to filter and get the folder with the latest date in Google Dataflow

I am passing in a wildcard match string such as gs://dev-test/dev_decisions-2018-11-13*/ and passing it to TextIO as below:
p.apply(TextIO.read().from(options.getLocalDate()))
Now I want to read all the folders from the bucket named dev-test, filter them, and only read files from the latest folder. Each folder has a name with a timestamp appended to it.
I am new to Dataflow and not sure how I would go about doing this.
Looking at the JavaDoc here, it seems we can write:
String folder = // The GS path to the latest/desired folder.
PCollection<String> myPcollection = p.apply(TextIO.read().from(folder + "/*"));
The resulting PCollection will then contain all the text lines from all the files in the specified folder.
Assuming you can have multiple folders in the same bucket with the same date prefix/suffix, for example "data-2018-12-18_part1", "data-2018-12-18_part2", etc., the following will work. It's a Python example, but the same approach works for Java as well; you just need to format the date to match your folder names and construct the path accordingly.
# defining the input path pattern
import datetime

import apache_beam as beam

input_pattern = 'gs://MYBUCKET/data-' + datetime.datetime.today().strftime('%Y-%m-%d') + '*/*'
(p
 | 'ReadFile' >> beam.io.ReadFromText(input_pattern)
 # ... the rest of the pipeline ...
)
This will read all the files from all the folders matching the pattern.
If you know that the most recent folder will always be today's date, you could use a literal string as in Tanveer's answer. If you don't know that and need to filter the actual folder names for the most recent date, I think you'll need to use FileIO.match to read the file and directory names, collect them all onto one node in order to figure out which folder is the most recent, and then pass that folder name into TextIO.read().from().
The filtering might look something like:
ReduceByKey.of(FileIO.match("mypath"))
    .keyBy(e -> 1) // constant key to get everything onto one node
    .valueBy(e -> e)
    .reduceBy(s -> ???) // your code for finding the newest folder goes here
    .windowBy(new GlobalWindows())
    .triggeredBy(AfterWatermark.pastEndOfWindow())
    .discardingFiredPanes()
    .output()
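Alternatively, you can resolve the newest folder before the pipeline is even built. Here is a minimal Python sketch using the google-cloud-storage client; the bucket and prefix come from the question, and the assumption (mine) is that the timestamps in the folder names are zero-padded, so the lexicographic max() is also the most recent:

# Sketch: find the newest timestamped "folder" in the bucket, then read it.
import apache_beam as beam
from google.cloud import storage

client = storage.Client()
blobs = client.list_blobs('dev-test', prefix='dev_decisions-', delimiter='/')
list(blobs)  # the iterator must be consumed before .prefixes is populated
latest_folder = max(blobs.prefixes)  # e.g. 'dev_decisions-2018-11-13.../'

with beam.Pipeline() as p:
    lines = p | 'ReadLatest' >> beam.io.ReadFromText(
        'gs://dev-test/' + latest_folder + '*')

This keeps the folder selection out of the Beam graph entirely, at the cost of one extra listing call at pipeline-construction time.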

iup.getparam in Lua -- prompting for a directory

I'm writing a Lua program that must prompt the user for a directory as one of a number of parameters for an operation (that involves copying a file to a target directory with a new name). Environment is Windows; I'm using Lua 5.1.
The relevant code currently looks like:
require("iuplua")
local mediaFolder = "C:\\some folder\\some subfolder\\" -- backslashes must be doubled inside a Lua string literal
local pPrompt = -- this is a subset of the parameters
    "File name: %s\n"..
    "Destination: %f[DIR||"..mediaFolder.."]\n"
ret, strTargetFile, strTargetPath =
    iup.GetParam("Add Media from file ", param_action, pPrompt, "Initial file name", mediaFolder)
The resulting GUI looks right (screenshot omitted), but when the selector button (...) is pressed, the initial directory shown is not C:\some folder\some subfolder\ but whatever directory was last navigated to in the interface, and it isn't possible to select a directory, only a file.
I'm guessing I have a fundamental misunderstanding of how this should work? Is what I want to do possible with iup? Ideally, I'd also like to restrict the user to only selecting the initial directory or one of its sub-directories rather than navigating anywhere outside that directory structure, and to allow the user to create a new sub-folder.
This looks like a bug. I'll check it.
I don't know if Stack Overflow is the place for bug reports, but I monitor IUP posts here.
Best

How do I change flat file source format in SSIS when it's inside a foreach loop?

My flat file connection manager has no file name selected because it's inside a foreach loop. With no file name selected, I can't add any new columns. These questions don't have the "no file name" issue, so they didn't work for me:
Add extra external column to flat file source
How to change flat file source using foreach loop container in an SSIS package?
How do I add a new column to my file format?
Here are my flat file connection manager and foreach loop (screenshots omitted):
Just select a file name in your Abc flat file connection manager, then set up the structure.
If you're using a Foreach Loop container to cycle through file names, the file you choose won't matter; it's used for design only.
If you get a run-time error because the file doesn't exist, change the DelayValidation property of your Abc connection manager to True.
