jsPDF - multiple pages - rendering HTML always going to page 1

We use jsPDF 2.5.1 to render a multi-page PDF.
We use the html function to render various DOM elements to each page; this was working in version 1.x of jsPDF.
However, now every time we call .html(), it puts the content on the first page rather than on the newly added page. Here is the code:
if (pdfPageIndex < numPdfPages) {
  if (pdfPageIndex > 0) {
    pdf.addPage();
  }
  pdf.html(
    document.getElementById('pdfPage_' + pdfPageIndex),
    {
      html2canvas: {
        logging: true
      },
      callback: function () { return pdfCallback($scope); }
    });
}
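One commonly suggested workaround keeps .html() but chains the calls and passes a y offset in document coordinates, since jsPDF 2.x positions html() output in the overall document rather than on the "current" page. A minimal sketch, assuming the x/y options behave this way in 2.5.1 (renderPage and done are hypothetical helpers, the element ids follow the question):
// Sketch only: render each #pdfPage_N element onto its own page by chaining
// .html() calls and offsetting y by whole page heights.
function renderPage(pdf, pageIndex, numPdfPages, done) {
  if (pageIndex >= numPdfPages) { done(); return; }
  if (pageIndex > 0) pdf.addPage();
  pdf.html(document.getElementById('pdfPage_' + pageIndex), {
    x: 0,
    y: pageIndex * pdf.internal.pageSize.getHeight(), // offset into the target page
    html2canvas: { logging: true },
    callback: function () { renderPage(pdf, pageIndex + 1, numPdfPages, done); }
  });
}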

Please try the addImage approach below; based on the canvas height and width it will render across multiple pages:
const data = document.getElementById('pdfPage_');
html2canvas(data).then((canvas: any) => {
  const imgWidth = 208;
  const pageHeight = 295;
  const imgHeight = (canvas.height * imgWidth) / canvas.width;
  let heightLeft = imgHeight;
  let position = 0;
  heightLeft -= pageHeight;
  const doc = new jspdf('p', 'mm');
  doc.addImage(canvas, 'PNG', 0, position, imgWidth, imgHeight, '', 'FAST');
  while (heightLeft >= 0) {
    position = heightLeft - imgHeight;
    doc.addPage();
    doc.addImage(canvas, 'PNG', 0, position, imgWidth, imgHeight, '', 'FAST');
    heightLeft -= pageHeight;
  }
  doc.save('Downld.pdf');
});
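For what it's worth, the position = heightLeft - imgHeight bookkeeping above simply draws the one tall canvas image at a negative y offset so each page shows the next slice. An equivalent sketch with an explicit page count (same variables as above; it also avoids the trailing blank page the >= comparison can produce):
const pageCount = Math.ceil(imgHeight / pageHeight);
for (let page = 1; page < pageCount; page++) {
  doc.addPage();
  // page n shows the slice starting at n * pageHeight of the full-height image
  doc.addImage(canvas, 'PNG', 0, -page * pageHeight, imgWidth, imgHeight, '', 'FAST');
}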

Related

html2canvas out of memory

Good day, colleagues. I have a problem: html2canvas throws an "out of memory" error in Google Chrome when generating screenshots.
window.jsPDF = window.jspdf.jsPDF;
let printArea = [...document.querySelectorAll('.print:not([style*="display: none;"])')];
var doc = new jsPDF('l', 'mm', "a4");
let tasks = printArea.map(tab => html2canvas(tab, { scale: 2, removeContainer: true }));
Promise.all(tasks).then(canvases => {
  var boolAdd = false;
  console.log(canvases);
  for (const canvas of canvases) {
    if (true == boolAdd) {
      doc.addPage();
    }
    let imgData = canvas.toDataURL('image/jpeg', 0.6);
    const pageHeight = doc.internal.pageSize.getHeight();
    const imgWidth = doc.internal.pageSize.getWidth();
    var imgHeight = canvas.height * imgWidth / canvas.width;
    var heightLeft = imgHeight - 20;
    var position = 10;
    doc.addImage(imgData, 'JPEG', 20, position, imgWidth - 40, imgHeight);
    heightLeft -= pageHeight;
    position += heightLeft - imgHeight; // top padding for other pages
    doc.addImage(imgData, 'PNG', 0, position, imgWidth, imgHeight, undefined, 'FAST');
    heightLeft -= pageHeight;
    boolAdd = true;
  }
  console.log("report");
  doc.save('report.pdf');
});
How can I fix the error, or free up memory? Alternatively, you could suggest another library if html2canvas cannot be made to work without memory problems.
Example: https://embed.plnkr.co/plunk/Zz4iFK (memory usage exceeds 700 MB)
Good day, comrades. I solved my problem: I replaced the html2canvas library with htmlToImage and its toCanvas method. –
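For reference, a minimal sketch of that replacement in the loop above, assuming the html-to-image npm package exposed as htmlToImage (its toCanvas returns a Promise of a canvas, and pixelRatio plays roughly the role of html2canvas's scale):
// Sketch: same per-tab loop, but html-to-image produces the canvases.
let tasks = printArea.map(tab => htmlToImage.toCanvas(tab, { pixelRatio: 2 }));
Promise.all(tasks).then(canvases => {
  // ...add each canvas to the jsPDF document exactly as before
});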

Distorting images using FabricJS filters and custom controls: dragging the corner control points resizes the image from the center

I have created a subclass in Fabric.js 4.3.0 extending fabric.Image; this lets me change the render function so that the image always fits in the bounding box.
I have also created a custom filter for Fabric which, given 4 corner coordinates, distorts the image, similar to Photoshop's Free Transform -> Distort tool.
While my code works, the issue is that when I drag the corner controls, the image always resizes from the center, moving the other control points as well.
I am trying to follow the instructions on how to resize objects in Fabric using custom control points; the instructions work on polygons and other shapes, but they do not yield the required result with images.
The result I want to achieve is that when dragging one of the green control points, the image distorts, but the image and the other control points stay in their positions without moving, similar to what you see here: https://youtu.be/Pn-9qFNM6Zg?t=274
Here is a JSFIDDLE for the demo: https://jsfiddle.net/human_a/p6d71skm/
fabric.textureSize = 4096;
// Set default filter backend
fabric.filterBackend = new fabric.WebglFilterBackend();
fabric.isWebglSupported(fabric.textureSize);
fabric.Image.filters.Perspective = class extends fabric.Image.filters.BaseFilter {
/**
* Constructor
* @param {Object} [options] Options object
*/
constructor(options) {
super();
if (options) this.setOptions(options);
this.applyPixelRatio();
}
type = 'Perspective';
pixelRatio = fabric.devicePixelRatio;
bounds = {width: 0, height: 0, minX: 0, maxX: 0, minY: 0, maxY: 0};
hasRelativeCoordinates = true;
/**
* Array of attributes to send with buffers. do not modify
* @private
*//** @ts-ignore */
vertexSource = `
precision mediump float;
attribute vec2 aPosition;
attribute vec2 aUvs;
uniform float uStepW;
uniform float uStepH;
varying vec2 vUvs;
vec2 uResolution;
void main() {
vUvs = aUvs;
uResolution = vec2(uStepW, uStepH);
gl_Position = vec4(uResolution * aPosition * 2.0 - 1.0, 0.0, 1.0);
}
`;
fragmentSource = `
precision mediump float;
varying vec2 vUvs;
uniform sampler2D uSampler;
void main() {
gl_FragColor = texture2D(uSampler, vUvs);
}
`;
/**
* Return a map of attribute names to WebGLAttributeLocation objects.
*
* @param {WebGLRenderingContext} gl The canvas context used to compile the shader program.
* @param {WebGLShaderProgram} program The shader program from which to take attribute locations.
* @returns {Object} A map of attribute names to attribute locations.
*/
getAttributeLocations(gl, program) {
return {
aPosition: gl.getAttribLocation(program, 'aPosition'),
aUvs: gl.getAttribLocation(program, 'aUvs'),
};
}
/**
* Send attribute data from this filter to its shader program on the GPU.
*
* @param {WebGLRenderingContext} gl The canvas context used to compile the shader program.
* @param {Object} attributeLocations A map of shader attribute names to their locations.
*/
sendAttributeData(gl, attributeLocations, data, type = 'aPosition') {
const attributeLocation = attributeLocations[type];
if (gl[type + 'vertexBuffer'] == null) {
gl[type + 'vertexBuffer'] = gl.createBuffer();
}
gl.bindBuffer(gl.ARRAY_BUFFER, gl[type+'vertexBuffer']);
gl.enableVertexAttribArray(attributeLocation);
gl.vertexAttribPointer(attributeLocation, 2, gl.FLOAT, false, 0, 0);
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);
}
generateSurface() {
const corners = this.perspectiveCoords;
const surface = verb.geom.NurbsSurface.byCorners(...corners);
const tess = surface.tessellate();
return tess;
}
/**
* Apply the resize filter to the image
* Determines whether to use WebGL or Canvas2D based on the options.webgl flag.
*
* @param {Object} options
* @param {Number} options.passes The number of filters remaining to be executed
* @param {Boolean} options.webgl Whether to use webgl to render the filter.
* @param {WebGLTexture} options.sourceTexture The texture setup as the source to be filtered.
* @param {WebGLTexture} options.targetTexture The texture where filtered output should be drawn.
* @param {WebGLRenderingContext} options.context The GL context used for rendering.
* @param {Object} options.programCache A map of compiled shader programs, keyed by filter type.
*/
applyTo(options) {
if (options.webgl) {
const { width, height } = this.getPerspectiveBounds();
options.context.canvas.width = width;
options.context.canvas.height = height;
options.destinationWidth = width;
options.destinationHeight = height;
this.hasRelativeCoordinates && this.calculateCoordsByCorners();
this._setupFrameBuffer(options);
this.applyToWebGL(options);
this._swapTextures(options);
}
}
applyPixelRatio(coords = this.perspectiveCoords) {
for(let i = 0; i < coords.length; i++) {
coords[i][0] *= this.pixelRatio;
coords[i][1] *= this.pixelRatio;
}
return coords;
}
getPerspectiveBounds(coords = this.perspectiveCoords) {
coords = this.perspectiveCoords.slice().map(c => (
{
x: c[0],
y: c[1],
}
));
this.bounds.minX = fabric.util.array.min(coords, 'x') || 0;
this.bounds.minY = fabric.util.array.min(coords, 'y') || 0;
this.bounds.maxX = fabric.util.array.max(coords, 'x') || 0;
this.bounds.maxY = fabric.util.array.max(coords, 'y') || 0;
this.bounds.width = Math.abs(this.bounds.maxX - this.bounds.minX);
this.bounds.height = Math.abs(this.bounds.maxY - this.bounds.minY);
return {
width: this.bounds.width,
height: this.bounds.height,
minX: this.bounds.minX,
maxX: this.bounds.maxX,
minY: this.bounds.minY,
maxY: this.bounds.maxY,
};
}
/**
* @description coordinates are coming in relative to mockup item sections
* the following function normalizes the coords based on canvas corners
*
* @param {number[]} coords
*/
calculateCoordsByCorners(coords = this.perspectiveCoords) {
for(let i = 0; i < coords.length; i++) {
coords[i][0] -= this.bounds.minX;
coords[i][1] -= this.bounds.minY;
}
}
/**
* Apply this filter using webgl.
*
* @param {Object} options
* @param {Number} options.passes The number of filters remaining to be executed
* @param {Boolean} options.webgl Whether to use webgl to render the filter.
* @param {WebGLTexture} options.originalTexture The texture of the original input image.
* @param {WebGLTexture} options.sourceTexture The texture setup as the source to be filtered.
* @param {WebGLTexture} options.targetTexture The texture where filtered output should be drawn.
* @param {WebGLRenderingContext} options.context The GL context used for rendering.
* @param {Object} options.programCache A map of compiled shader programs, keyed by filter type.
*/
applyToWebGL(options) {
const gl = options.context;
const shader = this.retrieveShader(options);
const tess = this.generateSurface(options.sourceWidth, options.sourceHeight);
const indices = new Uint16Array(_.flatten(tess.faces));
// Clear the canvas first
this.clear(gl); // !important
// bind texture buffer
this.bindTexture(gl, options);
gl.useProgram(shader.program);
// create the buffer
this.indexBuffer(gl, indices);
this.sendAttributeData(gl, shader.attributeLocations, new Float32Array(_.flatten(tess.points)), 'aPosition');
this.sendAttributeData(gl, shader.attributeLocations, new Float32Array(_.flatten(tess.uvs)), 'aUvs');
gl.uniform1f(shader.uniformLocations.uStepW, 1 / gl.canvas.width);
gl.uniform1f(shader.uniformLocations.uStepH, 1 / gl.canvas.height);
this.sendUniformData(gl, shader.uniformLocations);
gl.viewport(0, 0, options.destinationWidth, options.destinationHeight);
// enable indices up to 4294967296 for webGL 1.0
gl.getExtension('OES_element_index_uint');
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);
}
clear(gl) {
gl.clearColor(0, 0, 0, 0);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
}
bindTexture(gl, options) {
if (options.pass === 0 && options.originalTexture) {
gl.bindTexture(gl.TEXTURE_2D, options.originalTexture);
} else {
gl.bindTexture(gl.TEXTURE_2D, options.sourceTexture);
}
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}
indexBuffer(gl, data) {
const indexBuffer = gl.createBuffer();
// make this buffer the current 'ELEMENT_ARRAY_BUFFER'
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
// Fill the current element array buffer with data
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, data, gl.STATIC_DRAW);
}
};
/**
* Returns filter instance from an object representation
* @static
* @param {Object} object Object to create an instance from
* @param {function} [callback] to be invoked after filter creation
* @return {fabric.Image.filters.Perspective} Instance of fabric.Image.filters.Perspective
*/
fabric.Image.filters.Perspective.fromObject = fabric.Image.filters.BaseFilter.fromObject;
/**
* Photo subclass
* @class fabric.Photo
* @extends fabric.Image
* @return {fabric.Photo} thisArg
*
*/
fabric.Photo = class extends fabric.Image {
type = 'photo';
repeat = 'no-repeat';
fill = 'transparent';
initPerspective = true;
cacheProperties = fabric.Image.prototype.cacheProperties.concat('perspectiveCoords');
constructor(src, options) {
super(options);
if (options) this.setOptions(options);
this.on('added', () => {
const image = new Image();
image.setAttribute('crossorigin', 'anonymous');
image.onload = () => {
this._initElement(image, options);
this.width = image.width / 2;
this.height = image.height / 2;
this.loaded = true;
this.setCoords();
this.fire('image:loaded');
};
image.src = src;
this.on('image:loaded', () => {
!this.perspectiveCoords && this.getInitialPerspective();
this.togglePerspective();
this.canvas.requestRenderAll();
});
});
}
cacheProperties = fabric.Image.prototype.cacheProperties.concat('perspectiveCoords');
/**
* @private
* @param {CanvasRenderingContext2D} ctx Context to render on
*//** @ts-ignore */
_render(ctx) {
fabric.util.setImageSmoothing(ctx, this.imageSmoothing);
if (this.isMoving !== true && this.resizeFilter && this._needsResize()) {
this.applyResizeFilters();
}
this._stroke(ctx);
this._renderPaintInOrder(ctx);
}
/**
* @private
* @param {CanvasRenderingContext2D} ctx Context to render on
*//** @ts-ignore */
_renderFill(ctx) {
var elementToDraw = this._element;
if (!elementToDraw) return;
ctx.save();
const elWidth = elementToDraw.naturalWidth || elementToDraw.width;
const elHeight = elementToDraw.naturalHeight || elementToDraw.height;
const width = this.width;
const height = this.height;
ctx.translate(-width / 2, -height / 2);
// get the scale
const scale = Math.min(width / elWidth, height / elHeight);
// get the top left position of the image
const x = (width / 2) - (elWidth / 2) * scale;
const y = (height / 2) - (elHeight / 2) * scale;
ctx.drawImage(elementToDraw, x, y, elWidth * scale, elHeight * scale);
ctx.restore();
}
togglePerspective(mode = true) {
this.set('perspectiveMode', mode);
// this.set('hasBorders', !mode);
if (mode === true) {
this.set('layout', 'fit');
var lastControl = this.perspectiveCoords.length - 1;
this.controls = this.perspectiveCoords.reduce((acc, coord, index) => {
const anchorIndex = index > 0 ? index - 1 : lastControl;
let name = `prs${index + 1}`;
acc[name] = new fabric.Control({
name,
x: -0.5,
y: -0.5,
actionHandler: this._actionWrapper(anchorIndex, (_, transform, x, y) => {
const target = transform.target;
const localPoint = target.toLocalPoint(new fabric.Point(x, y), 'left', 'top');
coord[0] = localPoint.x / target.scaleX * fabric.devicePixelRatio;
coord[1] = localPoint.y / target.scaleY * fabric.devicePixelRatio;
target.setCoords();
target.applyFilters();
return true;
}),
positionHandler: function (dim, finalMatrix, fabricObject) {
const zoom = fabricObject.canvas.getZoom();
const scalarX = fabricObject.scaleX * zoom / fabric.devicePixelRatio;
const scalarY = fabricObject.scaleY * zoom / fabric.devicePixelRatio;
var point = fabric.util.transformPoint({
x: this.x * dim.x + this.offsetX + coord[0] * scalarX,
y: this.y * dim.y + this.offsetY + coord[1] * scalarY,
}, finalMatrix
);
return point;
},
cursorStyleHandler: () => 'cell',
render: function(ctx, left, top, _, fabricObject) {
const zoom = fabricObject.canvas.getZoom();
const scalarX = fabricObject.scaleX * zoom / fabric.devicePixelRatio;
const scalarY = fabricObject.scaleY * zoom / fabric.devicePixelRatio;
ctx.save();
ctx.translate(left, top);
ctx.rotate(fabric.util.degreesToRadians(fabricObject.angle));
ctx.beginPath();
ctx.moveTo(0, 0);
ctx.strokeStyle = 'green';
if (fabricObject.perspectiveCoords[index + 1]) {
ctx.strokeStyle = 'green';
ctx.lineTo(
(fabricObject.perspectiveCoords[index + 1][0] - coord[0]) * scalarX,
(fabricObject.perspectiveCoords[index + 1][1] - coord[1]) * scalarY,
);
} else {
ctx.lineTo(
(fabricObject.perspectiveCoords[0][0] - coord[0]) * scalarX,
(fabricObject.perspectiveCoords[0][1] - coord[1]) * scalarY,
);
}
ctx.stroke();
ctx.beginPath();
ctx.arc(0, 0, 4, 0, Math.PI * 2);
ctx.closePath();
ctx.fillStyle = 'green';
ctx.fill();
ctx.stroke();
ctx.restore();
},
offsetX: 0,
offsetY: 0,
actionName: 'perspective-coords',
});
return acc;
}, {});
} else {
this.controls = fabric.Photo.prototype.controls;
}
this.canvas.requestRenderAll();
}
_actionWrapper(anchorIndex, fn) {
return function(eventData, transform, x, y) {
if (!transform || !eventData) return;
const { target } = transform;
target._resetSizeAndPosition(anchorIndex);
const actionPerformed = fn(eventData, transform, x, y);
return actionPerformed;
};
}
/**
* @description manually reset the bounding box after points update
*
* @see http://fabricjs.com/custom-controls-polygon
* @param {number} index
*/
_resetSizeAndPosition = (index, apply = true) => {
const absolutePoint = fabric.util.transformPoint({
x: this.perspectiveCoords[index][0],
y: this.perspectiveCoords[index][1],
}, this.calcTransformMatrix());
this._setPositionDimensions({});
const penBaseSize = this._getNonTransformedDimensions();
const newX = (this.perspectiveCoords[index][0]) / penBaseSize.x;
const newY = (this.perspectiveCoords[index][1]) / penBaseSize.y;
this.setPositionByOrigin(absolutePoint, newX + 0.5, newY + 0.5);
apply && this._applyPointsOffset();
}
/**
* This is modified version of the internal fabric function
* this helps determine the size and the location of the path
*
* @param {object} options
*/
_setPositionDimensions(options) {
const { left, top, width, height } = this._calcDimensions(options);
this.width = width;
this.height = height;
var correctLeftTop = this.translateToGivenOrigin(
{
x: left,
y: top,
},
'left',
'top',
this.originX,
this.originY
);
if (typeof options.left === 'undefined') {
this.left = correctLeftTop.x;
}
if (typeof options.top === 'undefined') {
this.top = correctLeftTop.y;
}
this.pathOffset = {
x: left,
y: top,
};
return { left, top, width, height };
}
/**
* @description this is based on fabric.Path._calcDimensions
*
* @private
*/
_calcDimensions() {
const coords = this.perspectiveCoords.slice().map(c => (
{
x: c[0] / fabric.devicePixelRatio,
y: c[1] / fabric.devicePixelRatio,
}
));
const minX = fabric.util.array.min(coords, 'x') || 0;
const minY = fabric.util.array.min(coords, 'y') || 0;
const maxX = fabric.util.array.max(coords, 'x') || 0;
const maxY = fabric.util.array.max(coords, 'y') || 0;
const width = Math.abs(maxX - minX);
const height = Math.abs(maxY - minY);
return {
left: minX,
top: minY,
width: width,
height: height,
};
}
/**
* @description This is a modified version of the internal fabric function
* this subtracts the path offset from each path points
*/
_applyPointsOffset() {
for (let i = 0; i < this.perspectiveCoords.length; i++) {
const coord = this.perspectiveCoords[i];
coord[0] -= this.pathOffset.x;
coord[1] -= this.pathOffset.y;
}
}
/**
* @description generate the initial coordinates for warping, based on image dimensions
*
*/
getInitialPerspective() {
let w = this.getScaledWidth();
let h = this.getScaledHeight();
const perspectiveCoords = [
[0, 0], // top left
[w, 0], // top right
[w, h], // bottom right
[0, h], // bottom left
];
this.perspectiveCoords = perspectiveCoords;
const perspectiveFilter = new fabric.Image.filters.Perspective({
hasRelativeCoordinates: false,
pixelRatio: fabric.devicePixelRatio, // the Photo is already retina ready
perspectiveCoords
});
this.filters.push(perspectiveFilter);
this.applyFilters();
return perspectiveCoords;
}
};
/**
* Creates an instance of fabric.Photo from its object representation
* @static
* @param {Object} object Object to create an instance from
* @param {Function} callback Callback to invoke when an image instance is created
*/
fabric.Photo.fromObject = function(_object, callback) {
const object = fabric.util.object.clone(_object);
object.layout = _object.layout;
fabric.util.loadImage(object.src, function(img, isError) {
if (isError) {
callback && callback(null, true);
return;
}
fabric.Photo.prototype._initFilters.call(object, object.filters, function(filters) {
object.filters = filters || [];
fabric.Photo.prototype._initFilters.call(object, [object.resizeFilter], function(resizeFilters) {
object.resizeFilter = resizeFilters[0];
fabric.util.enlivenObjects([object.clipPath], function(enlivedProps) {
object.clipPath = enlivedProps[0];
var image = new fabric.Photo(img, object);
callback(image, false);
});
});
});
}, null, object.crossOrigin || 'anonymous');
};
const canvas = new fabric.Canvas(document.getElementById('canvas'), {
backgroundColor: 'white',
enableRetinaScaling: true,
});
function resizeCanvas() {
canvas.setWidth(window.innerWidth);
canvas.setHeight(window.innerHeight);
}
resizeCanvas();
window.addEventListener('resize', () => resizeCanvas(), false);
const photo = new fabric.Photo('https://cdn.artboard.studio/private/5cb9c751-5f17-4062-adb7-6ec2c137a65d/user_uploads/5bafe170-1580-4d6b-a3be-f5cdce22d17d-asdasdasd.jpg', {
left: canvas.getWidth() / 2,
top: canvas.getHeight() / 2,
originX: 'center',
originY: 'center',
});
canvas.add(photo);
canvas.setActiveObject(photo);
body {
margin: 0;
}
<script src="https://cdn.jsdelivr.net/npm/lodash#4.17.20/lodash.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/verb-nurbs-web#2.1.3/build/js/verb.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/fabric#4.3.0/dist/fabric.min.js"></script>
<canvas id="canvas"></canvas>
I suspect that the reference to absolutePoint in _resetSizeAndPosition needs to take into account the origin for the image and that there is a simple fix to this issue. However, I didn't find a good way to do this and resorted to manually "correcting" this issue in _resetSizeAndPosition.
The modified version of _resetSizeAndPosition looks like so:
_resetSizeAndPosition = (index, apply = true) => {
const absolutePoint = fabric.util.transformPoint({
x: this.perspectiveCoords[index][0],
y: this.perspectiveCoords[index][1],
}, this.calcTransformMatrix());
let { height, width, left, top } = this._calcDimensions({});
const widthDiff = (width - this.width) / 2;
if ((left < 0 && widthDiff > 0) || (left > 0 && widthDiff < 0)) {
absolutePoint.x -= widthDiff;
} else {
absolutePoint.x += widthDiff;
}
const heightDiff = (height - this.height) / 2;
if ((top < 0 && heightDiff > 0) || (top > 0 && heightDiff < 0)) {
absolutePoint.y -= heightDiff;
} else {
absolutePoint.y += heightDiff;
}
this._setPositionDimensions({});
const penBaseSize = this._getNonTransformedDimensions();
const newX = (this.perspectiveCoords[index][0]) / penBaseSize.x;
const newY = (this.perspectiveCoords[index][1]) / penBaseSize.y;
this.setPositionByOrigin(absolutePoint, newX + 0.5, newY + 0.5);
apply && this._applyPointsOffset();
}
The basic principle for this approach is that the left and top properties of the object are never being updated. This can be seen in your example through the console by modifying the image and checking the properties on the image. Therefore, we need to apply a correction to the position properties based on the changing width and height. This ensures that other points stay fixed in place, since we compensate for the changing height and width of the image in its position.
By comparing the values of width and this.width it's possible to determine whether the image is increasing or decreasing in size. The value of left indicates whether the stretch is occurring on the left or the right side of the image. If the user is stretching the image to the left, or shrinking it from the right, then we need to subtract the difference; otherwise we add it. By combining these conditions, we can tell how we need to modify the position of the image to compensate. The same approach used for the horizontal values is also applied to the vertical values.
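A quick worked example of the horizontal case (the numbers are hypothetical, not taken from the fiddle):
const oldWidth = 200;                      // this.width before the drag
const bounds = { width: 240, left: -40 };  // new bounds after stretching to the left
const widthDiff = (bounds.width - oldWidth) / 2; // 20
// left < 0 and widthDiff > 0, so absolutePoint.x -= 20: the untouched corners
// keep their on-canvas positions while the object's width grows to the left.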
JSFiddle: https://jsfiddle.net/0x8caow6/

JSPDF leaves white background when margin added

I am using jsPDF to print dynamic resume data and I want to add a margin when adding a new page to the PDF.
When the resume has a background color and I add a margin while generating the PDF, it leaves the margin area white; the rest is OK.
var data = document.getElementById('box');
html2canvas(data, { scale: 2 }).then(canvas => {
  var imgData = canvas.toDataURL('image/JPEG');
  var imgWidth = 210;
  var pageHeight = 295;
  var imgHeight = canvas.height * imgWidth / canvas.width;
  var heightLeft = imgHeight;
  var doc = new jsPDF('p', 'mm', "a4");
  var position = 1;
  doc.addImage(imgData, 'JPEG', 0, position, imgWidth, imgHeight, 'FAST');
  heightLeft -= pageHeight;
  while (heightLeft >= 0) {
    position = heightLeft - imgHeight;
    doc.addPage();
    doc.addImage(imgData, 'JPEG', 0, position, imgWidth, imgHeight);
    heightLeft -= pageHeight;
  }
  doc.save("Dashboard.pdf");
});
Use 'pt' instead of 'mm' as below; I hope it fixes it:
var doc = new jsPDF('p', 'pt', "a4");

Html2Canvas doesn't work... the PDF shows empty

When I run the app and click the button, the PDF is empty.
I checked with console.log() and the canvas doesn't show anything.
import { Component, OnInit } from '@angular/core';
import * as jspdf from 'jspdf';
import html2canvas from 'html2canvas';
generatePDF(){
html2canvas(document.getElementById('albaran')).then(canvas => {
// Few necessary setting options
var imgWidth = 208;
var pageHeight = 295;
var imgHeight = canvas.height * imgWidth / canvas.width;
var heightLeft = imgHeight;
const contentDataURL = canvas.toDataURL('image/png')
let pdf = new jspdf('p', 'mm', 'a4'); // A4 size page of PDF
var position = 0;
pdf.addImage(contentDataURL, 'PNG', 0, position, imgWidth, imgHeight)
pdf.save('MYPdf.pdf'); // Generated PDF
});
}
}
Finally, I found a solution. I use the jsPDF and dom-to-image libraries.
https://www.npmjs.com/package/jspdf
https://www.npmjs.com/package/dom-to-image
import * as jsPDF from 'jspdf';
import domtoimage from 'dom-to-image';
exportPdf() {
  const div = document.getElementById('pdf');
  const options = { background: 'white', height: 845, width: 595 };
  domtoimage.toPng(div, options).then((dataUrl) => {
    // Initialize jsPDF
    const doc = new jsPDF('p', 'mm', 'a4');
    // Add the image data URL to the PDF
    doc.addImage(dataUrl, 'PNG', 0, 0, 210, 340);
    doc.save('pdfDocument.pdf');
  });
}
Once you click the button it takes time to load the element from the DOM, so with setTimeout it will work:
import * as html2canvas from 'html2canvas';
import * as jspdf from 'jspdf';
generatePDF() {
setTimeout(() => {
const data = document.getElementById('printdiv');
html2canvas(data).then(canvas => {
// Few necessary setting options
const imgWidth = 208;
const pageHeight = 295;
const imgHeight = canvas.height * imgWidth / canvas.width;
let heightLeft = imgHeight;
const contentDataURL = canvas.toDataURL('image/png');
const pdf = new jspdf('p', 'mm', 'a4'); // A4 size page of PDF
let position = 0;
pdf.addImage(contentDataURL, 'PNG', 0, position, imgWidth, imgHeight);
heightLeft -= pageHeight;
// pdf.text(190, 294, '1');
let count = 1;
while (heightLeft >= 0) {
position = heightLeft - imgHeight;
pdf.addPage();
pdf.addImage(contentDataURL, 'PNG', 0, position, imgWidth, imgHeight);
// pdf.text(150, 10, 'this test meaasage');
count++;
// pdf.text(190, 294, count.toString());
heightLeft -= pageHeight;
}
const date = this.datePipe.transform(new Date(), 'dd/MM/yy');
const text = 'Created At :' + date;
pdf.setTextColor(163, 163, 163);
pdf.text(10, 290, text);
// pdf.text(190, 294, count.toString());
const currentuser = this.localstorgeservice.getCurrentUser();
const url = 'URL:' + this.document.location.href;
pdf.text(10, 280, url.toString());
pdf.text(150, 290, currentuser.userid);
pdf.save(this.bankingchannelname + '.pdf'); // Generated PDF
});
}, 700);
}
Here it is,
$(document).click(function () {
  domtoimage.toPng(document.body)
    .then(function (blob) {
      var pdf = new jsPDF('p', 'mm', 'a4');
      pdf.addImage(blob, 'PNG', 0, 0, 210, 225);
      pdf.save("test.pdf");
      that.options.api.optionsChanged();
    });
});

Center image doc.addImage jspdf

I am using html2canvas to take a screenshot of my page and create a PDF of the images using jsPDF. My images are left-aligned in the PDF document; I want them to be centered. How can I achieve that?
function pdfmaker() {
var element = $("#timesheet");
document.getElementById("message").style.display = "block";
document.getElementById("logo").style.display = "block";
var firstName = "<?php echo $fname?>";
var lastName = "<?php echo $lname ?>";
var startDate = "<?php echo $startDate?>";
var endDate = "<?php echo $endDate ?>";
html2canvas(element, {
useCORS: true,
onrendered: function(canvas) {
var imgData = canvas.toDataURL("image/png");
var imgWidth = 297; //297
var pageHeight = 297; //297
var imgHeight = canvas.height * imgWidth / canvas.width;
var heightLeft = imgHeight;
// var doc = new jsPDF('l', 'mm',[1350, 1350]);
var doc = new jsPDF('l', 'mm', [420, 297]); //420,297
var position = 5; //0
margins = {
top: 20,
bottom: 10,
left: 45,
width: 522
};
doc.addImage(imgData, 'PNG', 5, position, imgWidth, imgHeight);
heightLeft -= pageHeight;
while (heightLeft >= 5) {
position = heightLeft - imgHeight;
doc.addPage();
doc.addImage(imgData, 'PNG', 5, position, imgWidth, imgHeight);
heightLeft -= pageHeight;
}
doc.save(firstName + '_' + lastName + '_Summary_report_' + startDate + '_' + endDate + ".pdf");
}
});
document.getElementById("message").style.display = "none";
document.getElementById("logo").style.display = "none";
}
You need to set the position inside the addImage() method using the coordinate parameters, see: http://raw.githack.com/MrRio/jsPDF/master/docs/module-addImage.html
This is the only way you can do it. For this, I suggest you use doc.internal.pageSize.getWidth() to calculate how much wider the page is than the image, so the image can be centered.
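A minimal sketch of that calculation (the 150 mm image width is just an example value, narrower than the page):
var pageWidth = doc.internal.pageSize.getWidth(); // page width in mm for the document above
var imgWidth = 150; // example image width in mm
var imgHeight = canvas.height * imgWidth / canvas.width;
var x = (pageWidth - imgWidth) / 2; // left offset that centers the image horizontally
doc.addImage(imgData, 'PNG', x, position, imgWidth, imgHeight);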
