I have a large log file, approx. 72,493 KB.
Even Notepad takes 5-6 seconds to open it, and my Delphi 7 application takes more than 20 minutes.
I want to load the file into a Delphi 7 TreeView step by step, e.g. clicking a '..more..' node should display the next portion of the log in the TreeView.
Please let me know the possible ways to do this.
Thank you.
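One possible approach (not from the original post) is to keep the log file open and append a fixed number of lines per click, with a '..more..' placeholder node that triggers the next chunk. A rough Delphi 7 sketch, assuming a TTreeView; the path, chunk size and caption are illustrative:

uses ComCtrls;

const
  LinesPerChunk = 500;
  MoreCaption = '..more..';

var
  LogFile: TextFile;
  LogOpen: Boolean = False;

procedure LoadNextChunk(Tree: TTreeView);
var
  i: Integer;
  Line: string;
begin
  if not LogOpen then
  begin
    AssignFile(LogFile, 'C:\logs\big.log');  // example path
    Reset(LogFile);
    LogOpen := True;
  end;

  Tree.Items.BeginUpdate;
  try
    // Drop the previous "..more.." placeholder, if present
    with Tree.Items do
      if (Count > 0) and (Item[Count - 1].Text = MoreCaption) then
        Item[Count - 1].Delete;

    // Append the next chunk of lines as root-level nodes
    i := 0;
    while (i < LinesPerChunk) and not Eof(LogFile) do
    begin
      ReadLn(LogFile, Line);
      Tree.Items.Add(nil, Line);
      Inc(i);
    end;

    // Either close the file or offer another "..more.." node
    if Eof(LogFile) then
    begin
      CloseFile(LogFile);
      LogOpen := False;
    end
    else
      Tree.Items.Add(nil, MoreCaption);
  finally
    Tree.Items.EndUpdate;
  end;
end;

In the tree view's OnClick (or OnDblClick) handler, call LoadNextChunk whenever the selected node's text is '..more..'. The BeginUpdate/EndUpdate pair around the inserts is what keeps each chunk fast.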
I am creating a .pst file manually by opening it in Outlook and copying its own folders (containing mail items) back into itself, just to increase its size.
But after adding the copies, closing the file in Outlook, and closing Outlook itself, it shows the same size as it had before.
For example:
I have a 1 GB .pst file and I want to grow it to 4 GB by opening it in Outlook and copying its own folders (or their data) into itself.
But this failed; it still shows the same size, i.e. 1 GB, after I close it in Outlook.
I just want to know why the file size did not increase.
If any of you have tried this before, please let me know.
I have a few large txt files I'm trying to load into a data warehouse. I get an error message with the offending row/line number, but I cannot open the txt file to review it because it's too large (2,413,060 KB). Someone suggested using the command line (cmd) to do this, but I'm unsure how.
You can either use HJSplit to split the data, or use a Vim-based program such as gVim. It may also help to free up as much of your PC's RAM as possible.
I was asked to develop a game called "Flag Quiz" in which the player has to guess the correct name of the flag that appears in the middle of the screen.
Of course I have a lot of pictures (221 flags) and I have to put them inside the program because, when the Play button is pressed, the program has to randomly pick 10 of these flags.
Problem
I was thinking of using an ImageList, but the flags are 480x311, so Delphi asks me to split each picture into 30 different bitmaps. Can I do anything about this?
My idea to avoid that problem was the following (although I don't think it's very good): create 221 TPicture objects (invisible to the user, of course) and load a flag picture into each of them.
I'd prefer not to use that last idea. Do you know of any improvement?
This sort of problem is simply not suited to the form designer. You want to store 221 images, and managing that in the IDE will be horrible. Once you've got them all in you won't be able to see them readily because they will be base 16 encoded in a .dfm file. Under revision control it will be a mess because you won't be able to change individual images in a manageable and traceable manner.
The accepted way to do this is to use resources. If it were me, I'd arrange for my images to have predictable names, for instance flag1, flag2, etc. I'd generate a resource script (.rc) that listed all the flags. I'd compile that resource script to a compiled resource (.res), which is linked into the executable. I'd have the resource script and the image files committed to revision control.
Then at runtime you have a single TImage control to display the flag. Every time you need a new image you load it with TResourceStream, and push it into the TImage control.
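A minimal sketch of what that could look like, assuming the flags are compiled in as RCDATA entries named flag1..flag221 and that each resource holds a bitmap (the names, paths and format are assumptions, not from the answer):

// flags.rc (compile with "brcc32 flags.rc", link with {$R 'flags.res'}):
//   flag1 RCDATA "flags\flag1.bmp"
//   flag2 RCDATA "flags\flag2.bmp"
//   ...
//   flag221 RCDATA "flags\flag221.bmp"

uses Windows, SysUtils, Classes, Graphics, ExtCtrls;

procedure LoadFlag(Index: Integer; Target: TImage);
var
  RS: TResourceStream;
begin
  // Pull the bitmap straight out of the executable's resources
  RS := TResourceStream.Create(HInstance, 'flag' + IntToStr(Index), RT_RCDATA);
  try
    Target.Picture.Bitmap.LoadFromStream(RS);
  finally
    RS.Free;
  end;
end;

After a call to Randomize at startup, picking one of the ten random flags could be as simple as LoadFlag(Random(221) + 1, imgFlag), where imgFlag is the single TImage mentioned above.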
DevExpress has a component named cxImageCollection into which you can put your images; it can save and load the images to/from a file.
Or you can save all the flags in a small Access database and load them when needed using a TADODataSet.
There is no doubt that if you put your images directly on your form, your DFM will grow very large and you will get into trouble.
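A minimal sketch of the Access/TADODataSet idea, assuming a flags.mdb database with a Flags table that has a text column Country and a BLOB column Image holding the raw .bmp bytes; all of these names are illustrative, not from the answer:

uses Classes, DB, ADODB, Graphics;

procedure LoadFlagFromDb(const Country: string; Target: TPicture);
var
  DS: TADODataSet;
  MS: TMemoryStream;
begin
  DS := TADODataSet.Create(nil);
  MS := TMemoryStream.Create;
  try
    DS.ConnectionString :=
      'Provider=Microsoft.Jet.OLEDB.4.0;Data Source=flags.mdb';
    DS.CommandText := 'SELECT Image FROM Flags WHERE Country = :Country';
    DS.Parameters.ParamByName('Country').Value := Country;
    DS.Open;
    // Copy the BLOB field into a stream and load it as a bitmap
    TBlobField(DS.FieldByName('Image')).SaveToStream(MS);
    MS.Position := 0;
    Target.Bitmap.LoadFromStream(MS);
  finally
    MS.Free;
    DS.Free;
  end;
end;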
Personally I would store each flag as an image file in a dedicated subdirectory, using the country as the file name. Then I would read the subdirectory's file names on entry to the program (so I have a list of countries that I can randomly choose from) and use TImage.Picture.LoadFromFile to display the flag. This is far easier to extend than using a resource file (IMHO).
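A minimal sketch of this directory-based approach, assuming the flags are .bmp files named after the country in a flags subdirectory next to the executable (names and paths are illustrative):

uses SysUtils, Classes, Forms, Graphics, ExtCtrls;

var
  Countries: TStringList;

function FlagDir: string;
begin
  Result := ExtractFilePath(Application.ExeName) + 'flags\';
end;

procedure LoadCountryList;
var
  SR: TSearchRec;
begin
  Countries := TStringList.Create;
  // Build the list of countries from the file names on disk
  if FindFirst(FlagDir + '*.bmp', faAnyFile, SR) = 0 then
  try
    repeat
      Countries.Add(ChangeFileExt(SR.Name, ''));  // e.g. "France"
    until FindNext(SR) <> 0;
  finally
    FindClose(SR);
  end;
end;

procedure ShowRandomFlag(Target: TImage);
begin
  // Pick a random country and display its flag
  Target.Picture.LoadFromFile(
    FlagDir + Countries[Random(Countries.Count)] + '.bmp');
end;

Call Randomize once at startup and LoadCountryList before the first ShowRandomFlag; adding a new flag then only means dropping another file into the directory.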
In my app I want the user to be able to download offline map content.
So I moved all my tiles into a zip file (I used compression level 0, i.e. store only).
The structure is like this: z/x/y.jpg
+0
+-0
+--0.jpg
+1
+-1
+--0.jpg
+2
+-2
+--1.jpg
So basically there are going to be many, many files for zoom levels 0-15 (about 120,000 tiles for my test region).
I am now using https://github.com/mattconnolly/ZipArchive, but I also tried https://github.com/soffes/ssziparchive before, and both are pretty slow. It takes about 5 (!) minutes on my iPhone 5S to unzip the files.
Is there any way I can speed things up? What other options are there besides downloading the tiles in one big zip file?
Edit:
How can I quickly download the contents of the whole folder to my iPhone without needing to unzip anything?
Any help is appreciated!
JPGs rarely compress at all with zip; they are by definition already compressed. What you should do is create your own binary file format and put whatever metadata you need into it along with the images (which you should encode with a really low quality setting to get their size down).
When you download those files, you can open them, quickly read them into memory, and extract data or images as needed.
This will be really fast and have virtually no overhead if your extra data is binary (not text).
PS: I just stumbled upon a PHP Plist class.
If anyone is wondering how I ended up doing it:
For my use case (map tiles) I am now using MBTiles instead of zipped images. It's one big database file and super easy to read when using FMDB. No unpacking whatsoever needed...
Even when I placed the images all in one binary file without any compression, the "extracting" still took forever!
I have a spreadsheet that has several worksheets, each containing a hundred or so hyperlinks to other documents on our network (file:////server/share/path/to/spreadsheet.xls). Opening these files within Excel 2003 takes a very long time, compared to how long it takes to open in 2007 or 2010. I took a look in Task Manager at the network tab to see what was going on when these files are opened and noticed a lot of steady network traffic while the file was opening, and as soon as the spreadsheet finally displayed on the screen, the network traffic dropped down to almost nothing. As I removed some of those links, the file would open faster, until I finally got rid of all the links and the file opened almost as fast as any other normal spreadsheet. Is there a way to prevent Excel from doing whatever it's doing with those links when the file first opens?
This is a programming-related forum, so here is the programmatic route: open the workbook with the UpdateLinks argument set so that Excel does not try to refresh the external links when the file opens:
Workbooks.Open Filename:=AFile, UpdateLinks:=False   ' don't update external links on open
See: http://msdn.microsoft.com/en-us/library/ff194819.aspx