LibTiff.net - Save Directory

I have a massive TIFF file that contains 8 directories (resolutions). It is also tiled.
I can cycle through the directories and get the resolution of each. I want to save the 4th directory to a new TIFF file. I think it's possible but can't get my hands on it.
Basically I want to do this:
using (LibTiff.Classic.Tiff image = LibTiff.Classic.Tiff.Open(file, "r"))
{
    if (image.NumberOfDirectories() > 4)
    {
        image.SetDirectory(4);
        image.WriteDirectory(@"C:\Temp\Test.tif");   // this is what I'd like to be able to do
    }
}
It would be so nice if that was possible but I know I have to create an output image and copy the rows of data into it. Not sure how yet. Any help would be much appreciated.

There are no built-in methods in the LibTiff.Net library that can be used to copy one directory into a new file.
The task is quite complex and the best place to start is to look at the TiffCP utility's source code.
The utility can not only copy images, it can also extract individual directories.
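In the spirit of TiffCP, a minimal sketch could look like the code below. It copies only a handful of essential tags and assumes they are all present in the source directory; TiffCP itself carries over many more tags (including compression-specific ones such as JPEG tables), so treat this as a starting point rather than a complete solution.
using BitMiracle.LibTiff.Classic;

using (Tiff input = Tiff.Open(file, "r"))
using (Tiff output = Tiff.Open(@"C:\Temp\Test.tif", "w"))
{
    input.SetDirectory(4);   // 0-based index of the directory to extract

    // Copy a minimal set of single-valued tags from the source directory.
    TiffTag[] tags =
    {
        TiffTag.IMAGEWIDTH, TiffTag.IMAGELENGTH, TiffTag.BITSPERSAMPLE,
        TiffTag.SAMPLESPERPIXEL, TiffTag.PHOTOMETRIC, TiffTag.PLANARCONFIG,
        TiffTag.COMPRESSION, TiffTag.TILEWIDTH, TiffTag.TILELENGTH
    };
    foreach (TiffTag tag in tags)
    {
        FieldValue[] value = input.GetField(tag);
        if (value != null)
            output.SetField(tag, value[0].ToInt());
    }

    // Copy the tile data; ReadEncodedTile decodes each tile and
    // WriteEncodedTile re-encodes it with the output compression.
    byte[] buffer = new byte[input.TileSize()];
    for (int tile = 0; tile < input.NumberOfTiles(); tile++)
    {
        int read = input.ReadEncodedTile(tile, buffer, 0, buffer.Length);
        if (read > 0)
            output.WriteEncodedTile(tile, buffer, read);
    }

    output.WriteDirectory();
}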

Related

How to replace an entire file using FileManager

Let's say I have an UIImage cached in my Cache folder:
/.../Cache/Image Cache/<firstImage.id>
Now I want this folder to only ever have 10 images cached at a time, so if a new one comes in I want to replace an existing file entirely, name and all, not just its contents. I.e.
/.../Cache/Image Cache/<firstImage.id> becomes
/.../Cache/Image Cache/<secondImage.id>.
As far as I can tell, replaceItem(at:withItemAt:backupItemName:options:) only replaces the contents of the file while the file name remains the same. And I'm not too sure what replaceItem(at:withItemAt:backupItemName:options:resultingItemURL:) does, even though it might be what I'm looking for (I don't know what an AutoreleasingUnsafeMutablePointer<NSURL?>? is, but it sounded dangerous so I decided to leave it alone, especially since it has the word "unsafe" in it).
Is there a straightforward way to do this with a built-in function, or is manually deleting the old file and adding the new file the best way? Please let me know.
Thanks in advance!
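For what it's worth, the manual route described above (delete the old entry, then move the new file into the cache folder under its own name) is only a few lines. This is just a sketch; the paths and file names below are placeholders:
import Foundation

let fm = FileManager.default
let cacheDir = URL(fileURLWithPath: "/path/to/Cache/Image Cache")          // placeholder
let oldFile = cacheDir.appendingPathComponent("firstImage.id")             // placeholder
let incoming = URL(fileURLWithPath: "/path/to/incoming/secondImage.id")    // placeholder

// Remove the old cache entry (ignore the error if it is already gone) ...
try? fm.removeItem(at: oldFile)
// ... then move the new file into the cache folder, keeping its own name.
try? fm.moveItem(at: incoming, to: cacheDir.appendingPathComponent(incoming.lastPathComponent))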

Using ROOTFS_POSTPROCESS_COMMAND to add function that copies files

What I used to do this was use ROOTFS_POSTPROCESS_COMMAND variable to add my own shell script functions.
I needed to append to the petalinux-user-image recipe in meta-plnx-generated, so in my meta-user layer I created the following petalinux-user-image.bbappend:
inherit core-image
ROOTFS_POSTPROCESS_COMMAND += "my_install_function; "
my_install_function(){
echo "hello" > ${IMAGE_ROOTFS}/hello.txt
}
What I am having trouble with is how to add files to ${IMAGE_ROOTFS}. I can remove, move, and create files, but I can't seem to copy files from my meta-user layer into ${IMAGE_ROOTFS} the way normal recipes let me install files. ${WORKDIR} points to the rootfs folders in the build directory, and ${THIS_DIR} seems to point to the petalinux-user-image directory in meta-plnx-generated. I have given the meta-user layer a higher priority than the meta-plnx-generated layer, so the task order is correct.
Help or ideas would be appreciated, thanks.
The general answer is that you're doing this backwards. The best practice here is to write recipes for the additional files you want and install the resulting packages into your image; the ROOTFS_POSTPROCESS_COMMAND hook is intended for minor content tweaks.
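As a rough sketch, such a recipe could look like the one below. The recipe name, file name and install path are just examples, and whether you use the :append or _append override syntax depends on your Yocto release.
# recipes-example/my-files/my-files.bb (hypothetical recipe)
SUMMARY = "Install a file shipped with the layer into the rootfs"
LICENSE = "CLOSED"

SRC_URI = "file://hello.txt"

do_install() {
    install -d ${D}${sysconfdir}
    install -m 0644 ${WORKDIR}/hello.txt ${D}${sysconfdir}/hello.txt
}
The package then gets pulled into the image with IMAGE_INSTALL:append = " my-files" (or IMAGE_INSTALL_append on older releases) instead of copying anything in a rootfs postprocess function.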

Alter DICOM tags without saving

I'm using EvilDICOM to grab DICOM data from my DB and transfer it out to a directory where it can be used by another program. The secondary program periodically checks for new files, but I need to change a DICOM tag before it does.
I could have a temp location, change my tag, then resave it but I would rather change it while it is in memory and write it directly where it needs to go. I can't seem to figure out how to do that within the EvilDicom API.
Any suggestions?
(I am following the basic code in the "EvilDICOM in ESAPI" YouTube video.)
Take a look at the FileWriterSCP class. Just change the DIMSEService.CStorePayloadAction action, which gives you the DICOM file in memory:
DIMSEService.CStorePayloadAction = (dcm, asc) =>
{
    // DO STUFF WITH dcm variable HERE
};
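For illustration, the body of that action might alter a tag and then write the file straight to the watched folder. This is only a sketch from memory of the EvilDICOM API: the TagHelper.PatientName tag, the DData property, the DICOMObject.Write call, the bool return value and the output path are all assumptions that may differ between EvilDICOM versions, so check them against the version you use.
DIMSEService.CStorePayloadAction = (dcm, asc) =>
{
    // Change a tag while the object is still in memory (PatientName is just an example tag).
    var element = dcm.FindFirst(TagHelper.PatientName);
    if (element != null)
        element.DData = "ANONYMOUS";

    // Write directly to the folder the secondary program watches (placeholder path).
    dcm.Write(@"C:\Watched\incoming.dcm");

    // Recent EvilDICOM versions expect a success flag from this action.
    return true;
};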
The cleanest way is not to manipulate the file in memory, because then you rely on EvilDICOM's SCP being robust, and since I made it, I can tell you it's just "pretty good" ;) I would use a DICOM SCP like Varian's FileDaemon to catch and write the files, and then change them once they are on the hard drive.

scilab - Writing multiple images to a single folder

I am working in Scilab 5.5.2. I need to write multiple images to a single folder. The images which I want to write are cropped images from a set of inputs. I am able to write a single image to the folder with the following command:
imwrite(fname,strcat('C:\Users\dell\Desktop\example_sci\myfolder\1.jpg'));
I have put this in a for loop, but the output image is overwritten on every iteration, so the result is a single image.
How can I write all the results to a single folder?
You should create the paths like this.
for j = 1:10
    pathname = "C:\Users\dell\Desktop\example_sci\myfolder\" + string(j) + ".jpg";
    xs2jpg(gcf(), pathname);
end
I usually use the xs2* commands to export a chosen figure. (I use Scilab 5.5.0, which does not contain imwrite anymore.)
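If you prefer to stay with imwrite (which comes from an image-processing toolbox such as SIVP or IPD rather than base Scilab), the same idea applies: build a new file name from the loop index. A sketch, assuming the j-th cropped image is in a variable called cropped_img:
// assumes imwrite from the SIVP/IPD toolbox and the cropped image in cropped_img
for j = 1:10
    fname = "C:\Users\dell\Desktop\example_sci\myfolder\" + string(j) + ".jpg";
    imwrite(cropped_img, fname);
end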

Does speed of tar.gz file listing depend on tar size?

I am using tar tf to list the contents of a tar.gz file. It is pretty large, ~1 GB. There are around 1000 files organized in a year/month/day directory structure.
The listing operation takes quite a bit of time. It seems like a listing should be fast. Can anyone enlighten me on the internals?
Thanks.
Take a look at the tar format description on Wikipedia, for example, to verify that each file inside the tar is preceded by a header. To list all the files inside the tar, it is necessary to read the whole archive: there is no index at the beginning of the tar describing its contents. And because the archive is also gzip-compressed, the whole stream has to be decompressed just to walk those headers.
Tar has a simple file structure. If you want to list the files, you must parse the whole archive. If you are looking for one particular file, you can stop as soon as you find it, but then you must be sure the archive contains only one version of that file. That is typically the case for compressed archives, because appending to them is not supported.
For example, you can do something like this:
tar tvzf somefile.gz | grep 'pattern' | \
while read file; do foundfile="$file"; break; done
Here the loop breaks at the first match, so the pipeline does not read everything, only the part from the start of the archive up to the position of that file.
If you need to do more with the list, save it to a temporary file. You can gzip that file to save space if needed:
tar tvzf somefile.gz | gzip > temporary_filelist.gz
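Later searches can then run against the saved list instead of the tarball, for example:
zcat temporary_filelist.gz | grep 'pattern'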
