TCPDF taking much longer than FPDF - tcpdf

I am generating a report using both FPDF and TCPDF for comparison's sake. What I noticed is that FPDF took nearly 40 seconds but TCPDF took nearly 3 minutes and 20 seconds. Is there any reason for such a big difference between them?

That's my experience, too. For my PDFs, TCPDF takes 7-10 times longer. I strongly recommend FPDF unless you need the features that TCPDF has and FPDF doesn't. That list may be shorter than you think. See my book (free):
http://basepath.com/PDFbook
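If you want to put numbers on this yourself, a fixed-workload timing harness is the fairest comparison. Here is a minimal sketch; I'm using Python and fpdf2 (a port of FPDF, pip install fpdf2) as a stand-in since that's quickest to show, but the pattern (same document, same row count, a wall-clock timer around generation only) carries straight over to the PHP libraries:

```python
import time
from fpdf import FPDF  # fpdf2, a Python port of FPDF


def generate_report(rows):
    """Build a simple one-column report; stands in for the real report code."""
    pdf = FPDF()
    pdf.add_page()
    pdf.set_font("Helvetica", size=10)
    for i in range(rows):
        pdf.cell(0, 5, f"Row {i}: sample report line")
        pdf.ln()
    pdf.output("report.pdf")


start = time.perf_counter()
generate_report(rows=10_000)
print(f"generation took {time.perf_counter() - start:.2f}s")
```

Run the same workload through each library and compare only the generation time; including file I/O or data fetching in the timed region can easily hide the real difference.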

Related

Is Marshal.ReleaseComObject really necessary when using Microsoft.Office.Interop.Excel?

I have a mid-sized code library (several thousand lines) that uses Excel Interop (Microsoft.Office.Interop.Excel).
The program keeps a workbook open for hours at a time and does manipulations like adding/editing text and shapes and calling macros.
I have not once seen a call to Marshal.ReleaseComObject. Yet the users don't report any problems.
In all cases, the objects go out of scope within several seconds.
So, is this a problem? If so, how? And how do I justify to management that it needs cleanup? If not, why recommend it in the first place?
It's been a while, but I did a lot of Excel automation from .NET. I never used Marshal.ReleaseComObject either. Never saw a problem.
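For comparison, here is what the same kind of automation looks like from Python with pywin32 (assuming Excel is installed; the workbook contents are made up). Python releases each COM wrapper by reference counting as soon as the name goes away, which is the deterministic version of what the .NET GC eventually does for you, and what ReleaseComObject forces eagerly:

```python
import win32com.client  # pywin32; Windows only, assumes Excel is installed

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False

wb = excel.Workbooks.Add()
sheet = wb.Worksheets(1)
sheet.Cells(1, 1).Value = "hello"  # each dotted access holds a COM reference

wb.Close(False)  # don't save changes
excel.Quit()

# Dropping the Python names releases the underlying COM objects right away
# via reference counting; in .NET the RCW finalizer does the same release
# lazily, which is why "never called ReleaseComObject" usually still works.
del sheet, wb, excel
```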

XLS (CSV) or XML for importing data into Rails

I need to import data into my app. Right now I do it via XLS spreadsheets, but in my case the files have about 80,000 rows and importing is slow, so maybe it's better to choose another format? For example, will XML data import faster?
XML is unlikely to be any faster - it still needs to be parsed as strings and converted.
80,000 rows is quite a lot. How long does it take you?
Edit:
You can make what's happening more visible by dropping puts statements into your code, with timestamps. It's crude, but you can then time between various parts of your code to see which part takes the longest.
Or better yet, have a go at using ruby-prof to profile your code and see where it is spending the most time.
Either way, getting a more detailed picture of the slow points is a Good Idea.
You may find there's just one or two bottlenecks that can be easily fixed.
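The crude timestamp idea looks like this; a sketch in Python (the two pipeline steps are hypothetical stand-ins for your real parse and insert code, and the Ruby version with puts and Time.now is directly analogous):

```python
import time
from contextlib import contextmanager


@contextmanager
def timed(label):
    """Crude instrumentation: print how long a block of code took."""
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.2f}s")


# Hypothetical stand-ins for the real import steps, for illustration only:
def parse_spreadsheet():
    return [{"id": i} for i in range(80_000)]


def insert_rows(rows):
    return sum(r["id"] for r in rows)  # pretend work


with timed("parse spreadsheet"):
    rows = parse_spreadsheet()
with timed("insert rows"):
    insert_rows(rows)
```

Wrapping each suspected section like this quickly tells you whether the time goes into parsing the file or into the database inserts, which is what decides whether changing the file format would help at all.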

Delphi printing primer

I need to add printing capabilities to an app and I have been looking around for information about printing. Logical/physical sizes, dpi, font scaling, etc, lots to digest since I never programmed printing into any app before.
Are there any sites that would offer a primer on the topics of page sizes, margins and all the other elements required to understand printing on Windows? I've been looking around for a while but what I find is either cryptic or years old...
I've been playing around with TPrinter, but I would like to build solid printing functionalities and understand what I'm doing better.
Using a report solution is not an option, even though I'm sure it would provide better results much sooner.
Two links to get you started:
Printing with TPrinter
Printing via the TPrinter Canvas
I think you are looking at this at too low a level.
Try looking at the built-in reporting tools (Rave or whatever ships with your product).
Personally I am using a product called Report Builder from Digital Metaphors.
But if you want to do the low-level stuff, a lot of good information can be found at efg's computer lab - printing.
Well, I have done things a variety of ways in the past, including the "hard way" with TPrinter. In fact, I recently had to do that again to run a special inventory label printer.
On the other hand, sometimes you are better off taking work others have done and using it for your benefit. I agree that ReportSmith isn't so great, and also it's Delphi (and Windows) specific. Using Excel or Word has those limitations, plus the fact that the user has to actually have them installed.
One thing I have done to make printing easy for some simple applications is just to generate an HTML file and call the user's web browser, then they can print it. HTML tables can be created relatively easily for numerical data, and you can include photos, etc. as well. This works well for some applications, and works on every platform where a web browser is installed. The downside, of course, is that HTML isn't the most precise layout language.
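To show how little code that takes, here is a minimal sketch in Python; the report data and file name are made up:

```python
import webbrowser
from pathlib import Path

rows = [("Widget", 12, 3.50), ("Gadget", 7, 9.99)]  # made-up report data

cells = "\n".join(
    f"<tr><td>{name}</td><td>{qty}</td><td>{price:.2f}</td></tr>"
    for name, qty, price in rows
)
html = f"""<html><body>
<h1>Inventory report</h1>
<table border="1">
<tr><th>Item</th><th>Qty</th><th>Price</th></tr>
{cells}
</table>
</body></html>"""

out = Path("report.html")
out.write_text(html, encoding="utf-8")
webbrowser.open(out.resolve().as_uri())  # hand printing off to the browser
```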
The version of Delphi you're using is important. A number of Delphis came with print engines like ReportSmith (ugh). Another option, thinking laterally, is to use MS Word as a print engine. I've hooked into instances of Word & Excel before and utilised their functionality. As for raw printing using TPrinter or the Print method of TForm, you'd have to be pretty desperate. I seem to recall the Pacheco / Teixeira Delphi books coming with a pretty good overview, so you might want to see if you can find a copy of one of those somewhere.

Which is the best import / export LaTeX tool?

Working in academia publishing CS/math, you sooner or later find yourself trying to publish in a journal that will only accept .doc/.rtf. This means tedious, boring hours of translating line after line, especially equations, from LaTeX to an inferior format. Over the years I have tried a number of export tools for LaTeX, but none, at least among the free ones, that I have been very satisfied with. I'd like this page to collect and monitor the best import/export tools for LaTeX, to .doc/.rtf or to other useful formats (e.g. HTML, MathML).
Thus, what is your one favorite import or export LaTeX tool?
AFAIK there isn't really a convenient and effective way to achieve what you're trying to do. What I usually do on those rare occasions is export to PDF, select all the text, and paste it into Word. It's horrible, it messes things up, and of course it doesn't adjust your citations.
To this day I don't understand how people writing in scientific fields can write and publish in Word. It is common in some human-computer interaction literature but I have not seen it in other conferences and journals. May I ask which one it is?
Also, some places, once you've already been accepted, will be willing to take a PDF if you push them on it. You may have to make a few small adjustments yourself. Negotiation sometimes works here.
The UK TeX FAQ has been collecting answers on this for quite some time now. :)
See Conversion from (La)TeX to HTML and Other conversions to and from (La)TeX. There is another FAQ specifically about Converters between LaTeX and PC Textprocessors maintained by Wilfried Hennings.
For LaTeX to HTML there are LaTeX2HTML, TtH, TeX4ht, TeXpider and Hevea; in my experience TeX4ht is the best. For LaTeX to Word, you can go through RTF with TeX2RTF (not so good), or through Adobe Acrobat, which can produce PDF that Word can read (not good either), or go through HTML as above; but best is to use TeX4ht, which can generate OpenOffice ODT format, from which conversion to Word is easy.
The UK TeX FAQ also has many other useful things; you should take a look.
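To make the TeX4ht route concrete, the conversions are one command per target once tex4ht is installed; a sketch driving them from Python (paper.tex is a placeholder, and I'm assuming the standard htlatex and mk4ht entry points from the tex4ht distribution):

```python
import subprocess

# HTML output via tex4ht's htlatex driver:
subprocess.run(["htlatex", "paper.tex"], check=True)

# OpenOffice ODT output, which Word can then open and save as .doc:
subprocess.run(["mk4ht", "oolatex", "paper.tex"], check=True)
```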

Best Free Text Editor Supporting *More Than* 4GB Files? [closed]

I am looking for a text editor that will be able to load a 4+ gigabyte file. TextPad doesn't work: I own a copy and have been to its support site; it just doesn't do it. Maybe I need new hardware, but that's a different question. The editor needs to be free or, if it's going to cost me, then no more than $30. For Windows.
glogg could also be considered, for a different usage:
Caveat (reported by Simon Tewsi in the comments, Feb. 2013):
it has two search functions, Main Search and Quick Find.
The lower one, which I assume is Quick Find, is at least an order of magnitude slower than the upper one, which is fast.
I've had to look at monster (runaway) log files (20+ GB). I used the free version of HexEdit, which can work with files of any size. It is also open source. It is a Windows executable.
Jeff Atwood has a post on this here: http://www.codinghorror.com/blog/archives/000229.html
He eventually went with Edit Pad Pro, because "Based on my prior usage history, I felt that EditPad Pro was the best fit: it's quite fast on large text files, has best-of-breed regex support, and it doesn't pretend to be an IDE."
Instead of loading a gigantic log file in an editor, I use Unix command line tools like grep, tail, gawk, etc. to filter the interesting parts into a much smaller file, and then I open that.
On Windows, try Cygwin.
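If you'd rather not install Cygwin, the same filter-then-open idea is a few lines of Python (the file names and the filter string are placeholders):

```python
# Stream a huge log once, keep only the interesting lines, write a small file.
pattern = "ERROR"  # placeholder filter
with open("huge.log", encoding="utf-8", errors="replace") as src, \
     open("huge-errors.log", "w", encoding="utf-8") as dst:
    for line in src:  # iterates lazily; never loads the whole file
        if pattern in line:
            dst.write(line)
```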
Have you tried the ConTEXT editor? It is small and fast.
I stumbled on this post many times, as I often need to handle huge files (10+ gigabytes).
After getting tired of buggy and pretty limited freeware, and not being willing to pay for costly editors once their trials expired (not worth the money after all), I just used VIM for Windows with great success and satisfaction.
It is simply PERFECT for this need: fully customizable, with ALL the features one can think of when dealing with text files (searching, replacing, reading, etc., you name it).
I am very surprised nobody answered that (except a previous answer, but that was for macOS)...
For the record, I stumbled on it in this blog post, which wisely advised it.
It's really tough to handle a 4 GB file as such. I used to handle larger text files, but I never loaded them into my editor. I mostly used UltraEdit in my previous company; now I use Notepad++, but I would extract just those parts which I needed to edit. (In most cases, the files never needed an edit.)
Why do you want to load such a big file into an editor? When I handled files of this size, I used GNU Core Utils. The most common operations I performed on those files were head (to get the top 250k lines, etc.), tail, split, sort, shuf, uniq, etc. It's really powerful.
There's a lot you can do with GNU Core Utils. I would definitely recommend those, instead of a new editor.
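And if you need the file in pieces rather than filtered, the split step is just as easy to script; a rough Python equivalent of split -l 250000 (file names are placeholders):

```python
# Roughly `split -l 250000 big.txt chunk_`: write fixed-size line chunks.
CHUNK_LINES = 250_000

with open("big.txt", encoding="utf-8", errors="replace") as src:
    chunk, count = None, 0
    for i, line in enumerate(src):
        if i % CHUNK_LINES == 0:  # time to start the next chunk file
            if chunk:
                chunk.close()
            chunk = open(f"chunk_{count:03d}.txt", "w", encoding="utf-8")
            count += 1
        chunk.write(line)
    if chunk:
        chunk.close()
```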
Sorry to post on such an old thread, but I tried several of the tips here, and none of them worked for me.
It's slightly different than a text editor, but I found that Beyond Compare could handle an extremely large (3.6 Gig) file on my Vista 32-bit machine.
This is a file that Emacs, Large Text File Viewer, HexEdit, and Notepad++ all choked on.
My favourite after trying a few to read a 6GB mysqldump file:
PilotEdit Lite http://www.pilotedit.com/
Because:
Memory usage has (somehow?!) never gone above 25MB, so basically no impact on the rest of my system - though it took several minutes to open.
There was an accurate progress bar during that time so I knew how it was getting on.
Once open, simple searching and browsing through the file all worked as well as they would in a small Notepad file.
It's free.
Others I tried...
EmEditor Pro trial was very impressive, the file opened almost instantly, but unfortunately too expensive for my requirements.
EditPad Pro loaded the whole 6GB file into memory and slowed everything to a crawl.
For Windows, Unix, or Mac? On the Mac or *nix you can use command-line or GUI versions of Emacs or vim.
For the Mac: TextWrangler handles big files well. I'm not versed enough in the Windows landscape to help out there.
If you just want to view a large file rather than edit it, there are a couple of freeware programs that read files a chunk at a time rather than trying to load the entire file into memory. I use these when I need to read through large (> 5 GB) files.
Large Text File Viewer by swiftgear http://www.swiftgear.com/ltfviewer/features.html
Big File Viewer by Team Walrus.
You'll have to find the link yourself for that last one because, being a newbie, I can only post a maximum of one hyperlink.
When I'm faced with an enormous log file, I don't try to look at the whole thing; I use Free File Splitter.
Admittedly this is a workaround rather than a solution, and there are times when you would need the whole file. But often I only need to see a few lines from a larger file and that seems to be your problem too. If not, maybe others would find that utility useful.
A viewer that lets you see enormous text files isn't much help if you are trying to get it loaded into Excel to use the Autofilter, for example. Since we all spend the day breaking down problems into smaller parts to be able to solve them, applying the same principle to a large file didn't strike me as contentious.
HxD -- it's a hexeditor, but it allows in place edits, and doesn't barf on large files.
Tweak is a hex editor which can handle edits to very large files, including inserts and deletes.
EmEditor should handle this. As their site claims:
EmEditor is now able to open files even larger than 248 GB (or 2.1 billion lines) by opening a portion of the file with the new custom bar, the Large File Controller. The Large File Controller allows you to specify the beginning point, end point, and range of the file to be opened. It also allows you to stop the opening of the file and monitor the real size of the file and the size of the temporary disk available.
Not free, though.
I found that FAR Manager could open large files (I tried a 4.2 GB XML file).
It does not load the entire file into memory, and it works fast.
Opened a 5 GB file (quickly) with:
1) Hex Editor Neo
2) 010 editor
TextPad also works well at opening files that size. I have done it many times when having to deal with extremely large log files in the 3-5 GB range. Also, using grep to pull out the worthwhile lines and then looking at those works great.
The question would need more details.
Do you want just to look at a file (eg. a log file) or to edit it?
Do you have more memory than the size of the file you want to load, or less?
For example, TheGun, a very small text editor written in assembly language, claims to "not have an effective file size limit and the maximum size that can be loaded into it is determined by available memory and loading speed of the file. [...] It has been speed optimised for both file load and save."
To get around the memory limit, I suppose one can use mapped memory. But then, if you need to edit the file, some clever method would be needed, like storing the local changes in memory and applying them chunk by chunk when saving. This might be inefficient in some cases (a big search/replace, for example).
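The mapped-memory idea is only a few lines in practice; for instance, a read-only Python sketch that searches a huge file without pulling it into RAM (file name and search term are placeholders):

```python
import mmap

with open("huge.log", "rb") as f:
    # Map the file read-only; the OS pages data in on demand, so even a
    # 4+ GB file doesn't need to fit in RAM (requires a 64-bit build).
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        pos = mm.find(b"ERROR")  # placeholder search term
        if pos != -1:
            mm.seek(pos)
            print(mm.readline().decode("utf-8", errors="replace"))
```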
I have had problems with TextPad on 4G files too. Notepad++ works nicely.
Emacs can handle huge file sizes and you can use it on Windows or *nix.
What OS and CPU are you using? If you are using a 32-bit OS, then a process on your system physically cannot address more than 4 GB of memory. Since most text editors try to load the entire file into memory, I doubt you'll find one that will do what you want. It would have to be a very fancy text editor that can do out-of-core processing, i.e. load a chunk of the file at a time.
You may be able to load such a huge file if you use a 64-bit text editor on a computer with a 64-bit CPU and a 64-bit operating system. And you have to make sure you have enough space in your swap partition or swap file.
Why do you want to load a 4+ GB file into memory? Even if you find a text editor that can do it, does your machine have 4 GB of memory? And unless it has a lot more than 4 GB of physical memory, your machine will slow down a lot and go swap-file crazy.
So why do you want to load a 4+ GB file? If you want to transform it, or do a search and replace, you may be better off writing a small, quick program to do it.
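For instance, a streaming search-and-replace in Python that never holds more than one line in memory (file names and strings are placeholders, and it assumes the text being replaced doesn't span lines):

```python
# Replace text in an arbitrarily large file, one line at a time.
with open("big.txt", encoding="utf-8", errors="replace") as src, \
     open("big-fixed.txt", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(line.replace("old-value", "new-value"))  # placeholder strings
```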
I also like Notepad++.
