I have installed Adobe LiveCycle in order to convert MS Word files to PDF from a web app.
Specifically, I use the DocConverter tool. Previously I used the OpenOffice UNO SDK, but I ran into problems with particular documents.
Now the conversion output is fine, but the conversion time is huge.
These are the times to convert documents of different sizes via OpenOffice and via LiveCycle.
Could you suggest anything?
SIZE (bytes)    OpenOffice (sec)    Adobe LiveCycle (sec)
   24,064              1                    8
   50,688              0                    3
  100,864              0                    3
  253,952              0                    5
  509,440              1                    5
1,017,856              5                   18
2,098,688              8                   10
4,042,240             19                   45
6,281,216              0                    9
8,212,480             32                  125
The main reason for the long conversion time is how LiveCycle PDF Generator works. To maintain document fidelity, the document is programmatically opened in the native application it was originally created in, and then converted to PDF.
Consider increasing your application server heap size, and LiveCycle Document Max Inline Size. Here are two helpful articles:
Heap Size Tuning and Large Document Processing
I have learned that Adobe LiveCycle performs better and faster on Windows than on Solaris. Document-to-PDF conversion is always single-threaded, which ends up consuming more resources. Make sure to increase the heap size and timeouts; it also helps if load balancing is done across multiple servers.
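For example, on a JBoss-based LiveCycle installation the heap is raised in the application server's JVM options. The file location and values below are illustrative only; consult the tuning articles above for your setup:

```shell
# Example JVM options (e.g. appended in JBoss's run.conf) -- values are illustrative
JAVA_OPTS="$JAVA_OPTS -Xms1024m -Xmx2048m"
```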
I need to store plenty of data points (a time series) coming from a sensor-rich device (a SensorTag).
Is there any recommended framework for storing plenty of fast-streaming data?
What type of local storage system do you recommend: SQL, a file, or something else?
Details
- Data comes in at 25 Hz
- Each row might hold about 70 bytes of data
- It's a continuous capture for 12 hours straight
When I did something similar with a BTLE device, I used Core Data with one new managed object instance per reading. To avoid excessive Core Data work, I didn't save changes after every new instance-- I'd save at intervals, after 100 new unsaved readings were available.
You might need to tune the save interval, depending on details like how much data the new entries actually have, what else is happening in your app at the time, and what device(s) you support. In my case I was updating an OpenGL view to show a 3D visualization of the data in real time.
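The batching pattern described above can be sketched in a language-neutral way. The Python stand-in below uses a plain list in place of a Core Data context; the names are illustrative, not Apple API:

```python
class BatchedWriter:
    """Buffer readings and flush every `batch_size`, instead of saving per reading.

    `store` stands in for the persistent store; `flush` stands in for
    a Core Data context.save() call. Tune `batch_size` as described above.
    """

    def __init__(self, store, batch_size=100):
        self.store = store
        self.batch_size = batch_size
        self.pending = []

    def add(self, reading):
        # Accumulate in memory; only hit the store once per batch.
        self.pending.append(reading)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        # Stand-in for the expensive save; called every batch_size readings
        # and once more at shutdown to persist any tail of unsaved readings.
        if self.pending:
            self.store.extend(self.pending)
            self.pending.clear()
```

Remember to call `flush()` one final time when capture stops, so the last partial batch is not lost.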
Whatever you choose, make sure it lets you get the readings out of memory quickly. 25 Hz * 70 bytes * 12 hours is a little over 75 MB. You don't want all of that in RAM if you can avoid it.
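The back-of-envelope arithmetic behind that figure:

```python
rate_hz = 25          # readings per second
bytes_per_row = 70    # approximate bytes per reading
seconds = 12 * 3600   # 12-hour continuous capture

total_bytes = rate_hz * bytes_per_row * seconds
print(total_bytes)                  # 75600000
print(round(total_bytes / 1e6, 1))  # 75.6 (MB)
```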
Sorry if this has been addressed before; I searched and couldn't find a question like this. I am making a social media app and want to preserve image quality as much as possible. To create a data representation of an image I use "UIImageJPEGRepresentation", and I would like to know the optimal setting for its compression quality.
One big benefit of setting the quality below 1 is that the image uploads to my server in much less time. I have experimented with the compression, and I can't really tell the difference between 0.6 and 1 unless I zoom in on a computer, but I wanted to know if there is a number or range that would produce favorable results.
One interesting and evolving format is WebP, which Google has introduced. This article suggests that Facebook is also trying to use this format. To answer the exact question of how much to compress, note the following:
- The format of the image (I assume yours is JPEG)
- The compression technique (lossy or lossless)
- The target devices (I assume yours are mobile)
Considering the above parameters (and more), and looking at the image dimensions used by the social networking sites, I suggest raising the compression just to the point where you can see a difference in image quality on a computer; at that point you have found an optimal level. Remember: lower quality means smaller uploads, right up until the image visibly degrades. Additionally, you can find the information at this.
Apple states that an iOS app binary file can be as large as 2 GB, but the executable file (app_name.app/app_name) cannot exceed 60 MB. Does that mean that when I download my app from the App Store it can be as large as 2 GB? Or is it required that my app, including all my images and video, be less than 60 MB?
If my app including images and video must be less than 60 MB, how can I retrieve the rest of my assets?
The executable (.app) portion is limited to 60 MB and consists largely of your compiled code. When combined with the assets (images, sounds, videos, data, etc.) in an .ipa file, the maximum size is 2 GB.
Note that these limits change over time. See: Max size of an iOS application
I know it sounds crazy, but I'm sitting here with various phones on different networks, and all the phones on the Sprint network are failing to work correctly.
It looks as if the LESS stylesheet is not being applied. Has anyone ever run into anything like this?
I also just visited the LESS website, figuring all their styles would be created with LESS, and it's doing the same thing: failing to load/apply the LESS.
The specific phones I have tried on the Sprint network are two iPhones and one Android Optimus V.
I would strongly suggest pre-processing your LESS file into CSS and serving that on your site.
It is considerably more efficient, as even a medium-sized LESS file can take hundreds of milliseconds to process during page load. It is also one less JavaScript file to download in production. Finally, it is much less processor overhead on mobile devices, which would otherwise need to parse the JavaScript and then process the LESS file as well. Some mobile devices don't have caching or local storage, so they can end up re-processing the LESS file every time the page loads.
Either use the lessc compiler (requires Node.js) or SimpLESS to pre-compile your LESS files into CSS.
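For example, with the Node-based compiler, the compilation becomes a one-off build step instead of work done in every visitor's browser (assuming the `less` npm package):

```shell
npm install -g less           # provides the lessc compiler
lessc styles.less styles.css  # compile once at build time, deploy the CSS
```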
You could try to see if it's blocking the download of the less javascript or the stylesheet itself by loading their URLs directly.
For lesscss.org that would be http://lesscss.org/less/main.less and http://lesscss.org/js/less.js
I am looking for a component or other technique to compress and encrypt multiple large files (files which exceed 4 GB in size and thus will not fit in the memory available to a Win32 process) into a single file. I would like the encryption to be very strong (256-bit AES or better), but the compression doesn't matter to me.
Right now I'm using the TJvZlibMultiple component, which creates its own non-Zip file format, but I have to create the archive and then encrypt it in a separate step (I'm using DCPcrypt right now). I'd like to do both in a single pass, using streams, without using memory equal to the size of the file (ergo, the compression and encryption should happen on streams, not in memory).
I have seen, and don't want to use, anything that requires external DLLs like the 7-Zip DLL. Commercial tools are okay, as is any code or sample Delphi source, but I'm looking for a thorough implementation within Delphi, not something that imports and invokes functions in a DLL.
Take a look at DIZipWriter.
Supports 256 bit AES, streaming and compression.
Update:
Version 5 claims to support large (64-bit) entries; see the DIZipWriter history.
You could take a look at FlexCompress by ComponentAce: http://www.componentace.com/compression_component_compression_delphi_encryption_delphi_flexcompress.htm
Check FlexCompress:
FlexCompress is a compression Delphi component designed for creating archives with strong encryption and a better compression rate than WinZip and RAR. Native VCL, no DLLs, no OCXs; it provides compression for files, buffers, streams and strings, supports in-memory archives, compresses large files > 4 GB with low memory consumption, and has lots of other useful features.
http://www.componentace.com/flexcompress_features.htm
Found via this link: http://delphi.about.com/od/toppicks/tp/delphi_zip.htm